Immersion: AR Book Project
My Experience Creating an Immersive AR Book
I had an interesting opportunity as a student at UVU that most students don’t normally have. Our class collaborated with an English class to create an immersive AR book covering endangered animals and mass extinction events. It was an exciting project, and we enjoyed working together.
The team comprised two designers (including myself) and two writers. The writers handled the research and writing, and we designers created layouts to match their vision, as well as augmented reality experiences with Zapworks.
At the beginning of the project, I helped get things started by drafting ideas for colors, icons, layouts, and so on, so we could have a vision of what we might do. I also took on more of a project manager role, making sure that we were on the same page as the English students, that we had time to meet, and that everyone had a voice in our meetings.
I also helped make a rough draft of our layout, though I let the other designer have the final say on it, as they had an eye for layout. My main role was to be the authority on the AR features, thanks to my prior experience with Zapworks from my Redwall project and with Adobe Aero from my Augmented Christmas Card project.
We had to start somewhere, and as with any project, I had us begin with a wishlist of features based on our content brainstorming with the English students. The goal was to create 8 AR experiences in total for the book. We came up with the following list:
- Animated cover
- Slideshow info cards
- Atmospheric sound / Music while you read
- 360 image of an aquarium or zoo
- Animated Section Intro
- Video of Animals or Conservation Efforts in the US
- Link to Donation Page
- Works Cited
This ambitious list didn’t stay the same, but it at least gave us a starting point. As we went through the content the English students were creating, and kept in mind how much time we designers actually had for the project, we settled on the following instead:
- Cover transformation
- Spotify Link
- Animated Mass Extinction Timeline Infographic
- Decision Tree Diagram Legend and Glossary
- Endangered Species List Infographic
- Section Break Animation
- 3D Model of a California Condor Skull
- Works Cited
I worked on the infographics and glossary, the section break animation, and getting the 3D model of the California Condor skull working in AR.
Zapworks Designer Beta
For this whole project, I worked with the beta of Zapworks Designer. It was interesting to see how the beta changed while we worked, as more features became available (such as support for more 3D model file formats). I was intimidated at first, since I had never used the Designer portion of Zapworks before, but it turned out to be rather easy to work with.
Using Familiar Software
Most of the AR assets I made were created with Adobe Illustrator, Adobe InDesign, Tumult Hype, or a combination of the three. For example, I used Illustrator to recreate the Decision Tree one of our writers wanted to showcase in the completed book.
This was one of the most useful scholarly applications of AR I found. The Decision Tree was hard for me to grasp and contained a lot of terms I didn’t quite understand. After doing my own research on it, I replaced the words with icons and added a legend and glossary through Zapworks.
Learning New Software
I originally wanted a 3D model for the project. Unfortunately, most file types for free 3D models were not supported… not until about two weeks ago, when Zapworks updated the beta and added support for the glTF format, which made adding a 3D model to our project a real possibility. I found a condor skull on Sketchfab with a glTF download option. It was posted by the National Park Service Geologic Resources Division, and I knew it would fit our project.
When I first tried to drop it into Zapworks, I ran into an issue: the 3D model was a very large file, and the download did something odd with its formatting. I had to reformat and compress it using Blender. Blender is a very complicated 3D modeling program, and a lot of its features go over my head, but with the help of my husband, who is a whiz at Blender, we were able to simplify the model so it fit the 10 MB file size limit and to format the glTF correctly. The results are great!
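Before re-uploading each compressed attempt, it helped to sanity-check the file locally. As a rough sketch (the 10 MB figure reflects the beta’s upload limit at the time, and the helper name is my own, not part of any Zapworks or glTF API), a few lines of Python can confirm a .gltf parses as valid JSON and fits under the size cap:

```python
import json
import os

# 10 MB was the Zapworks beta's upload limit at the time of writing;
# this constant and the function below are hypothetical helpers.
MAX_BYTES = 10 * 1024 * 1024

def check_gltf(path):
    """Return (fits_limit, gltf_version) for a .gltf file.

    A .gltf file is plain JSON, so parsing it also catches a download
    whose formatting got mangled, which is the problem I hit here.
    """
    size = os.path.getsize(path)
    with open(path, "r", encoding="utf-8") as f:
        doc = json.load(f)  # raises ValueError if the JSON is broken
    version = doc.get("asset", {}).get("version")
    return size <= MAX_BYTES, version
```

Running this after each Blender export told me immediately whether the file was worth uploading, instead of finding out from a failed upload.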
Putting it All Together
I put all the assets together in the beta Designer and made sure the tracking image was a reasonable file size and that the items popping up would be readable. I learned a couple of things. First, the tracking image (what the zapcode has you point your camera at to pull up the AR experience) doesn’t need to be high resolution to work. Second, the image tracking isn’t too picky about seeing the whole image: as you can see in the skull image above, part of the image is covered by the zapcode itself, but that doesn’t stop the 3D model from showing up. I was worried when the other designer wanted to use this kind of design in the book, but after testing it, it turned out to be fine.
There are still some tweaks that definitely need to be made to the beta. Moving images around in the space is difficult; to place something exactly where you want it, you need to use the coordinates feature, which specifies where the item sits on the page. It all worked out in the end, though.
Conclusion of the Project
We now have a 55-page book complete with AR features and a design that ties all our ideas together. While there are things I wish we could have done differently (there’s always something to improve), we came away from this experience with a really fun deliverable. I enjoyed working with everyone on this project and hope for more experiences like this.