Caitlin Fisher’s Circle was installed at the Electronic Literature Organization Conference Media Art show in June 2012 and awarded the “Jury’s Choice” Award.
““Circle” is an augmented reality tabletop theatre piece that tells the story of three generations of women through a series of small stories. The first version of this piece was created using a custom marker tracking system and the user interacted with the piece by exploring the markers with a webcam, triggering small poetic voiceovers and videos. The version being premiered here was built in Unity and uses natural feature tracking — the black and white markers of the earlier version are replaced by objects and photos. The user interacts with the piece by holding up an iPad or smartphone as a magic looking glass to explore the story world.”
The installation was built for the iPad using Qualcomm’s Vuforia Augmented Reality SDK.
The popular ARToolkit library has been ported to work with Max/MSP. This simple 4-marker object is released under the GPL license and is available from Aranar Productions. For more information, contact ARToolworks.
What happens when technology breaks or behaves in unexpected ways? Why is this important to embrace? Papagiannis will speak about her PhD research on interactive art installations in Mixed Reality and Augmented Reality.
Helen Papagiannis is an artist, designer, and researcher specializing in Augmented Reality (AR). Hailed as one of the top 10 forces currently shaping the AR industry (Leading AR News Blog, Games Alfresco), Papagiannis has been working with AR since 2005, exploring the creative possibilities and theoretical implications of this exciting emerging technology. Recently, Papagiannis’ interactive artworks were featured in an exhibition at the Ontario Science Centre. She is presently completing her Ph.D. in Communication and Culture at York and is a Senior Research Associate at the Augmented Reality Lab (Department of Film, Faculty of Fine Arts). Prior to her graduate studies, Helen was a member of the internationally renowned Bruce Mau Design studio, where she was project lead on Massive Change: The Future of Global Design.
The Amazing Cinemagician was installed at York’s Glendon Campus for TEDx. Helen Papagiannis, one of our senior researchers, installed the exhibit, which debuted this summer at the Ontario Science Centre, and was featured as a TEDx presenter; we were proud to support her at this event. As they entered the Manor House on Glendon Campus, visitors were offered delightful digital cinematic prestidigitation using RFID technology and a walk-through FogScreen projection screen.
Lightning Talk by Andrew Roth: How to Make Your Own Augmented Reality Pop-up Book in 5 Minutes. Based on the projects we did with the Toronto Museum Project and the Ontario Science Centre, we are distributing a DIY kit for creating your own FLARToolkit pop-up book. See it in action at osc, Toronto, Requiem;
then click the links on each page to download the source. Each source package contains a reusable .swf that you can populate with images just by editing the file playlist.xml. You can even make your own marker: create a pattern at Marker Generator Online, add it to the data folder, and swap in the name of your new .pat file in pattern path="data/oscmkrb16.pat" in the file flarConfig.xml. Download the Source.
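For reference, the two files might look roughly like this. Only the `pattern path="data/oscmkrb16.pat"` attribute is quoted from the kit; the other element and attribute names here are illustrative assumptions, so check the downloaded source for the exact structure:

```xml
<!-- playlist.xml: one image (or clip) per page of the pop-up book
     (element names are illustrative, not from the kit) -->
<playlist>
  <item page="1" src="images/page1.jpg"/>
  <item page="2" src="images/page2.jpg"/>
</playlist>

<!-- flarConfig.xml: swap the path for the .pat file generated
     for your own marker at Marker Generator Online -->
<flarConfig>
  <pattern path="data/oscmkrb16.pat"/>
</flarConfig>
```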
“Thank you so much to the organizers at Site 3 for hosting this event and inviting me to present.”
The presentation slides are available on Andrew Roth’s website:
We would like to thank the Ontario Science Centre again for allowing us to participate in the Amazing Cinemagician installation. If you missed it at the centre here are some images from Helen Papagiannis:
In 2010, the Banff New Media Institute revitalized their VR CAVE. In keeping with work being done in the Augmented Reality Lab at York University, the framework used in the lab was made to be easily repurposed, portable, and capable of running on either a single laptop or multiple computers.
1. Create a server patch using cave.server and cave.send objects.
2. Modify cave.world (see the Readme File for details) to configure the number and positioning of the screens. You will also likely have to reconfigure cave.audio to match the speaker layout in your cave.
3. Create a single renderer patch using cave.world and cave.receive objects.
See cave.world.maxhelp for examples.
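Put together, the two patches wire up roughly like this. This is a textual sketch only (Max patches are visual); the object names come from the steps above, but the connections shown are a simplified assumption, so see cave.world.maxhelp for working examples:

```
Server patch                          Renderer patch (one per machine)

 [your scene / control messages]       [cave.receive]
              |                              |
         [cave.send]    --network-->    [cave.world]   <- configured for the
              |                                           number and position
        [cave.server]                                     of your screens
```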
The newest in the set of helper objects for Cosm for creating Augmented Reality environments, with a greatly simplified workflow to get you up and running quickly. Try it on your own machine: you don’t need a tracking system to take advantage of the navigational hotkeys supplied by the Cosm objects.
*Requires a full install of the Cosm library and the Intersense tracking .dylib. See is900.maxhelp for instructions.
Fixed: June 28, 2010 – Fixed Intersense transformations for room dimensions
Fixed: June 25, 2010 – installs to proper folders (Max5/examples/arlab)
arlab.bonk: greatly simplified collision reporting.
arlab.hand: a virtual hand which can be used for testing touch interactions.
arlab.mov: simplified videoplanes with spatialized sound, supports drag and drop.
arlab.sound: simplified soundfile with spatialized sound, supports drag and drop.
arlab.world: patcherargs allow you to define the size of your own workspace.
is900.mxo – fixed for Snow Leopard Compatibility, performance enhancements.
Some free models (.obj).
7 lessons on interactivity and building your world (in Max5/examples/arlab).
Experience cutting-edge technological wizardry that blurs the lines between art, design and science in The Amazing Cinemagician: New Media Meets Victorian Magic, opening May 29 at the Ontario Science Centre’s Idea Gallery.
The exhibition features two interactive installations by new media artist and York University PhD student Helen Papagiannis that use augmented reality (AR) technology, fog screen and radio frequency identification (RFID).
“It’s as simple as it is stunning. With the use of ordinary tap water and digital technology, FogScreen projection screen enables projected images to literally float in the air, creating a brand new medium to captivate and fascinate audiences. You can walk right through a FogScreen projection screen without getting wet. The microscopic fog droplets actually feel dry to the touch, just like air.” –fogscreen.com
This particular screen is owned by the Future Cinema Lab and resides in the Augmented Reality Lab. It has attracted a great deal of attention through its use at Nuit Blanche and in live performances at York (in the AGYU and Department of Dance), and will be at the Ontario Science Centre soon (details to come)…
The iPhone Development Workshop series is a free workshop for enthusiasts and academics interested in understanding the capabilities of mobile devices. Focusing on the iPhone as a successful “all-in-one” platform we discuss what’s possible, what’s not, and what’s next in developing for mobile devices. This workshop series is hosted by the Augmented Reality Lab at York University as part of the Future Cinema Lab.
Living Postcards is an interactive display designed for use at the Future of the Internet Conference, Prague (2009) and the Canada 3.0 Conference, Stratford (2009). Users approach the booth and watch a mirrored projection of themselves holding a movie clip in the palm of their hands. By holding a postcard imprinted with a black and white marker, users are greeted with a variety of images and have a picture taken of their experience. With the user’s permission, that image is then uploaded to an online gallery. The demonstration was meant to highlight Canadian technological advancements in the fields of augmented reality, storytelling, and, by extension, digital new media by offering an engaging, handheld experience. You can view the galleries through the links below.
52 Card Psycho is an installation-based investigation into cinematic structures and interactive cinema viewership. The concept is simple: 52 cards, each printed with a unique identifier, are replaced in the subject’s view by the individual shots that make up a movie scene. The cards can be stacked, dealt, arranged in their original order, or recomposed in different configurations, creating spreads of time. The technology used is marker-based augmented reality: special printed markers are recognized in the video feed and report their unique identifier, position, and orientation. The computer then drives a display that overlays the video clip of each shot onto the appropriate card, continually mapping its position and orientation. 52 Card Psycho has been presented at ISEA 2008 in Singapore, at Imagine RIT 2009 in Rochester, NY, and as a juried exhibition at ISEA 2009 in Belfast.
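The overlay step described above amounts to a planar homography: given the four card corners reported by the tracker, solve for the 3×3 matrix that maps the video clip’s corners onto them, frame by frame. A minimal NumPy sketch of that mapping (the corner coordinates are illustrative values, not data from the installation):

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 homography H mapping src points to dst points
    via the direct linear transform (four point correspondences)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # H is the null-space vector of A (right singular vector for the
    # smallest singular value), reshaped to 3x3.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def warp_point(H, p):
    """Apply homography H to a 2D point p (homogeneous divide included)."""
    x, y, w = H @ np.array([p[0], p[1], 1.0])
    return (x / w, y / w)

# Corners of the video clip in its own (unit-square) coordinates ...
clip = [(0, 0), (1, 0), (1, 1), (0, 1)]
# ... and the card's corners as reported by the marker tracker
# (illustrative pixel values).
card = [(120, 80), (300, 95), (310, 260), (110, 240)]

H = homography(clip, card)
print(warp_point(H, (0.5, 0.5)))  # centre of the clip lands inside the card
```

Re-solving H each frame as the tracker reports new corner positions is what keeps the clip pinned to the card as it is dealt, stacked, or rearranged.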
Handheld City is an online streaming experience developed by the AR Lab for the city of Toronto’s virtual museum project, which launched March 6 (Toronto’s 176th birthday). Using AR as a storytelling device, the researchers organized and animated the digital objects in the museum collection and created an interesting way to interact with the objects and access the accompanying text.
You can download the code and make your own handheld city!
A collection of abstractions for connecting to Intersense tracking grids (IS 900), built to complement COSM for Max/MSP from the Media Arts & Technology Program at UC Santa Barbara, USA. Objects include abstractions for communicating with individual trackers by connecting them to cosm.nav objects, for working with a video head-mounted display as a background image, and the is900.mxo external itself. NOTE: you will need to install the COSM objects and the dynamic library from Intersense; links are included in the is900.maxhelp file.