Learning to Code for Augmented Reality

In my late thirties I presented myself with a challenge: learn to code. Not to become a programmer per se, but to better understand the relationship between human/computer interaction and the language that controls it. Along the way I have developed a niche expertise in 3D programming.

It turns out knowledge of 3D is incredibly relevant for augmented reality! I am a visual, tactile, tangible learner. I like props and the ability to hold some representation of a concept in my hand.

I think there are many people like me who learn best when we can see, hold, touch, feel and interact. Coding presents a unique challenge to us because it is largely linear, sequential, and text driven. As coding environments mature, they become better at incorporating visual representations of code.

I am an educator and filmmaker, and my expertise lies in how to teach and in using media to support that learning. Part of my motivation in learning to code is learning how to teach coding. I believe augmented reality and 3D present a great learning landscape for design-focused visual/spatial/tactile learners.

I have been enrolled in an iOS certificate course at the University of Washington for the past six months. I am in my third and final module. The course has been a great foundation for modern app architecture using Apple's new Swift language (now in 4.0 and getting more stable with each release).

Many in the course have a computer science background and years of experience coding. They are topping off where I am filling up my tank. Learning Swift has been like learning Italian.

Both made me feel stupid and unable to articulate complex ideas initially. To extend the metaphor, building my first to-do list app that used Core Data was the equivalent of ordering cheese in a supermarket in Italy. That's confusing.

Early on in my journey, I was drawn to the lines of code that produced 3D visuals on screen. It used a camera based metaphor that was accessible to me with my filmmaking background. I became fascinated by adding interaction to the elements I created on screen.

The UW course was focused on the building blocks of Apple's Swift language and the use of Xcode. I have gained a fluency in reading code and, crucially, in distinguishing good from bad (lazy) practice and in debugging more effectively. There are many paths up the mountain to creating a piece of software, and much of what's out there offers a dubious trail.

I was very excited when Apple announced ARKit, their augmented reality API. I was amazed to learn that it was built on SceneKit, the 3D framework I had been teaching myself! The result is that I could port existing projects into Xcode 9 beta and start running my iOS 11 apps in an AR environment on my phone in minutes.

I am fascinated by the potential that AR holds, and it is early days, with many examples showing the novelty of AR rather than the utility of AR. I am confident that novelty will give way to utility and we will adopt it quickly. ARKit for Apple feels like a massive multiyear onboarding effort with 1 billion iOS users.

Only a fraction of this user base will be able to use AR when it launches in the fall, but a fraction of a billion is still hundreds of millions of people who can evangelize the technology. It will give the platform time to mature as developers work to figure out the best and most effective uses of AR. In these nascent days of augmented reality development, it has been exciting for me to follow along and contribute my own projects.

I have a particular interest in how to manipulate and interact with 3D objects in an AR environment. To this end I have been developing a simple example that allows me to demonstrate my efforts. Currently I have an app that allows me to place 3D objects in space and then move them around in X, Y, and Z, adjusting both position and rotation.
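A minimal sketch of the state such gestures mutate, assuming a hypothetical `PlacedObject` stand-in for the scene node (a real SceneKit app would mutate `SCNNode`'s `position` and `eulerAngles` properties instead; the gesture values below are invented):

```swift
import Foundation

// Hypothetical stand-in for the bits of a SceneKit node this demo touches.
struct PlacedObject {
    var position: (x: Float, y: Float, z: Float)    // metres, world space
    var eulerAngles: (x: Float, y: Float, z: Float) // radians

    // Pan gesture: translate the object in world space.
    mutating func move(dx: Float, dy: Float, dz: Float) {
        position = (position.x + dx, position.y + dy, position.z + dz)
    }

    // Rotation gesture: spin about the vertical (y) axis, kept in [0, 2π).
    mutating func rotate(byYaw yaw: Float) {
        eulerAngles.y = (eulerAngles.y + yaw).truncatingRemainder(dividingBy: 2 * .pi)
        if eulerAngles.y < 0 { eulerAngles.y += 2 * .pi }
    }
}

// Place a cube half a metre in front of the camera, then drag and twist it.
var cube = PlacedObject(position: (0, 0, -0.5), eulerAngles: (0, 0, 0))
cube.move(dx: 0.1, dy: 0, dz: 0.2)   // slide right and toward the camera
cube.rotate(byYaw: .pi / 2)          // quarter turn about y
```

The same position/rotation bookkeeping carries over directly once the struct is swapped for a real node.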

The example below shows the latest version of what I am working on. This example is really a testament to the tools Apple has created to help me develop. They are stable, intuitive and just work.

My development and learning as a programmer has really accelerated in the past couple months and I am excited to share it with you.

3 Things to Know About Apple's ARKit
Genuinely excited. Maybe those were the words I was looking for. But wait, I should not get ahead of myself. When we did In Shadows in 2014, it was a concept for an augmented reality, iBeacon-enabled game of tag. The world's first! Revolutionary at the time. Maybe too revolutionary. Despite runs in Singapore and Tokyo, it failed spectacularly.

The app failed for many reasons, and we would have rebooted it if it weren't for the fact that we used the Metaio SDK, and Metaio, the German company behind it, was bought by Apple soon after our launch. After that the SDK disappeared, and I was left in the dark about when Apple would finish integrating it into their ecosystem. Now, with the upcoming release of iOS 11, that time seems to have come. And given the promise of what ARKit might be able to do, I am genuinely excited. Of course there was the unreal Wingnut AR demonstration. But that's not it alone.

Dual cameras

Finally it makes sense. With the impending launch of iOS 11, literally overnight millions of iDevices will evolve into the world's largest augmented reality platform. Boom! And finally I understand why Apple installed dual cameras on every iDevice launched since the iPhone 7, even though I still believe Apple dropped the ball on the headphone jack. Dual cameras are good for better zoom capabilities, but also for better depth sensing. If you have two different viewpoints and you know the distance between them, it is possible to triangulate the distance of a given object point. That means that for every pixel the camera captures, the phone can calculate a depth map of what the camera sees. Our brains work in a similar way, allowing us to perceive our world in 3D; the effect is called stereopsis. The benefit of this technology is that the software gains an understanding of which objects are in the foreground and which are in the background, and of the relative positions of objects to each other.
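The depth-from-disparity idea can be sketched in a few lines. This is a toy illustration of the geometry, not Apple's actual pipeline, and the focal length and baseline figures are made up: a point's horizontal shift between the two views (its disparity) is inversely proportional to its distance.

```swift
import Foundation

// depth = focalLength * baseline / disparity
// (focal length and disparity in pixels, baseline and depth in metres)
func depth(focalLengthPx: Double, baselineM: Double, disparityPx: Double) -> Double? {
    guard disparityPx > 0 else { return nil }  // zero disparity = point at infinity
    return focalLengthPx * baselineM / disparityPx
}

// A nearby object shifts more between the two views than a distant one.
let near = depth(focalLengthPx: 1000, baselineM: 0.01, disparityPx: 20)!  // 0.5 m
let far  = depth(focalLengthPx: 1000, baselineM: 0.01, disparityPx: 2)!   // 5.0 m
```

Running this per pixel over the matched image pair is what turns two flat photos into a depth map.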
One example of what can then be done is blurring the background, an effect known as bokeh. Irrespective of the visual aesthetic, this technology lets ARKit connect real-world objects with digital information.

A.k.a. World Tracking

World tracking aims to create the illusion that the virtual world is part of the real world. That includes correct shadows, changing scale and perspective, and hit-testing digital props against flat surfaces. World tracking does not try to create a full representation of the world; it works through a technique called visual-inertial odometry (VIO). VIO pins objects to points and tracks these points across images in the video signal. ARKit estimates the relations between these points through three-dimensional projective geometry. The steps to take here are usually:

  • find matching points across multiple images,
  • compute the essential matrices (E) between image pairs,
  • decompose E to obtain relative poses,
  • and finally perform bundle adjustment (BA) to further improve the estimates.

Clearly, the better the relation estimates are, the more accurately ARKit is able to track points in the environment through the iDevice's camera and motion sensors. Combined with existing technologies like hand-gesture recognition, accelerometers, Bluetooth LE, and of course GPS, ARKit provides a treasure chest of opportunities for us app developers.

Acquisitions of relevant technology

Surely we all know Apple, and despite the continually bad press, they have not arrived here by accident. Besides the aforementioned Metaio, Apple acquired Linx, a company that had developed mobile technology for multi-aperture camera modules, which can enable effects like background focus blur, parallax images, and 3D picture capture. Remember bokeh? In addition, in 2013 Apple acquired PrimeSense, the company that licensed its 3D-sensing technology to Microsoft for the Kinect.
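As a toy illustration of the essential-matrix step, here is a numerical check of the epipolar constraint it encodes: for normalized image points x1, x2 of the same 3D point seen from two poses related by rotation R and translation t, the matrix E = [t]×R satisfies x2ᵀ·E·x1 = 0. This is a pure-Swift sketch with invented poses and an invented point, not ARKit API:

```swift
import Foundation

typealias Mat3 = [[Double]]

// Skew-symmetric cross-product matrix [t]x, so that [t]x * v = t × v.
func crossMatrix(_ t: [Double]) -> Mat3 {
    [[0, -t[2], t[1]],
     [t[2], 0, -t[0]],
     [-t[1], t[0], 0]]
}

func matMul(_ a: Mat3, _ b: Mat3) -> Mat3 {
    var c = Array(repeating: Array(repeating: 0.0, count: 3), count: 3)
    for i in 0..<3 { for j in 0..<3 { for k in 0..<3 { c[i][j] += a[i][k] * b[k][j] } } }
    return c
}

func matVec(_ a: Mat3, _ v: [Double]) -> [Double] {
    (0..<3).map { i in (0..<3).reduce(0) { $0 + a[i][$1] * v[$1] } }
}

func dot(_ a: [Double], _ b: [Double]) -> Double {
    zip(a, b).reduce(0) { $0 + $1.0 * $1.1 }
}

// Camera 2 is rotated 10 degrees about y and shifted 10 cm along x from camera 1.
let theta = 10.0 * .pi / 180
let R: Mat3 = [[cos(theta), 0, sin(theta)], [0, 1, 0], [-sin(theta), 0, cos(theta)]]
let t = [0.1, 0.0, 0.0]
let E = matMul(crossMatrix(t), R)  // essential matrix E = [t]x * R

// Project a 3D point into both (normalized, pinhole) cameras.
let P = [0.3, -0.2, 2.0]                      // point in camera-1 coordinates
let x1 = [P[0] / P[2], P[1] / P[2], 1.0]
let P2r = matVec(R, P)
let P2 = [P2r[0] + t[0], P2r[1] + t[1], P2r[2] + t[2]]  // same point, camera 2
let x2 = [P2[0] / P2[2], P2[1] / P2[2], 1.0]

let residual = dot(x2, matVec(E, x1))         // epipolar constraint: ~0
```

In practice the problem runs in the other direction: E is estimated from many such point matches, then decomposed to recover R and t, which is what makes the pose estimation possible.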
Go figure. I truly believe we are looking at the first millions-strong platform for augmented reality, making apps like these viable again. Is this the end for Magic Leap? Time will tell. We at tenqyu, however, will revisit our AR apps and will surely look into relaunching some of our previous concepts with ARKit enabled. Great times ahead. By the way, here is the trailer from back in the day for your enjoyment.