This week I spent a lot of time attempting to create virtual artifacts using the ARC 3D program. ARC 3D was developed under EPOCH, the European Research Network of Excellence in Processing Open Cultural Heritage, and is offered for free use to build virtual models of artifacts. In theory it sounds very simple: download the program, create a username and password, upload photographs of artifacts, wait, and open the files they send back. Sadly, my experience wasn't quite so stress-free. One vital stipulation, which Clark and I were hoping could be overlooked, is the way in which the pictures are taken. When we were photographing, it was easiest to set the lights up stationary at the edge of a table and then move the artifact. However, because this makes the lighting inconsistent relative to the vessel (when the vessel moves, the light falls on a different side of it), the program mailed back a giant unconnected blob of nothingness. Even when I attempted to take photographs as they direct, the light was still too inconsistent because I was relying primarily on overhead lights. Adam had the idea of placing an LED light (no heat to damage the artifact) on a turntable with the artifact, turning off the rest of the lights, and then photographing it from all angles. While this may work in theory, I feel it would be too time-consuming a process, and we should move on to look for other ways to create virtual artifacts.
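If we do try another photo set, it might save a round trip to the server to screen the images for consistent exposure before uploading. Here's a minimal sketch of that idea in Python, assuming a folder of JPEGs and the Pillow and NumPy libraries; the 15% tolerance is my own guess, not anything ARC 3D specifies.

```python
# Rough lighting-consistency check for a photogrammetry photo set.
# Assumes Pillow and NumPy are installed; the 15% tolerance is a guess.
import sys
from pathlib import Path

import numpy as np
from PIL import Image

def mean_brightness(path):
    """Average grayscale value of an image, 0-255."""
    with Image.open(path) as img:
        return np.asarray(img.convert("L"), dtype=np.float64).mean()

def flag_inconsistent(folder, tolerance=0.15):
    """Print images whose brightness strays from the set's median."""
    paths = sorted(Path(folder).glob("*.jpg"))
    if not paths:
        print("No .jpg files found.")
        return
    levels = {p.name: mean_brightness(p) for p in paths}
    median = np.median(list(levels.values()))
    for name, level in levels.items():
        if abs(level - median) / median > tolerance:
            print(f"{name}: brightness {level:.1f} vs median {median:.1f} -- reshoot?")

if __name__ == "__main__":
    flag_inconsistent(sys.argv[1] if len(sys.argv) > 1 else ".")
```

This only catches overall exposure drift, not shadows creeping around the vessel as it rotates, but it probably would have flagged our overhead-light photos before we spent a day waiting on the server.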
Since the ARC 3D program didn't work, I've been looking for other examples of successful virtual archaeology. This website, while older than I would like, is a good example of displaying artifacts where they would be found in a site and then letting the visitor interact with each artifact directly. Even the walkthrough of the kiva itself is well done. One problem with it is that the viewer is fixed within the space: they cannot zoom in or out or take a lower or higher viewpoint. I envision the Pachacamac project as more of a video game, where the visitor can interact with the landscape, which means varying their point of view and letting them move throughout the space (see the sketch below). Currently I'm systematically working through this site, which lists many wonderful virtual archaeology pages. However, I'm having some issues working on my laptop and will dig into them further when I get to the lab. Of special interest to me is the Paloma World link, which advertises being able to virtually participate in an archaeological dig.
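To make concrete what I mean by a free camera, here's a toy sketch in Python with NumPy. The class and its yaw/pitch conventions are my own illustration, not taken from any of the sites above; a real project would lean on a game engine's built-in camera.

```python
# Minimal free-look camera: the visitor can move anywhere and look
# in any direction, unlike a fixed-viewpoint panorama.
import numpy as np

class FreeCamera:
    def __init__(self, position=(0.0, 1.7, 0.0)):
        self.position = np.array(position, dtype=np.float64)
        self.yaw = 0.0    # radians, rotation about the vertical axis
        self.pitch = 0.0  # radians, looking up (+) or down (-)

    def forward(self):
        """Unit vector the camera is facing."""
        cp = np.cos(self.pitch)
        return np.array([np.sin(self.yaw) * cp,
                         np.sin(self.pitch),
                         -np.cos(self.yaw) * cp])

    def rotate(self, dyaw, dpitch):
        """Turn the view; clamp pitch so the camera never flips over."""
        self.yaw += dyaw
        self.pitch = np.clip(self.pitch + dpitch, -1.5, 1.5)

    def move(self, distance):
        """Walk along the current view direction."""
        self.position += distance * self.forward()

# A visitor lowers their gaze and steps toward a vessel:
cam = FreeCamera()
cam.rotate(dyaw=0.3, dpitch=-0.2)
cam.move(2.0)
print(cam.position, cam.forward())
```

The point is simply that position and view direction are both under the visitor's control, so "zooming in" on an artifact is just walking toward it.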
The Giza Project is in contention for my favorite 3D reconstruction of an archaeological site. While they did rely on archival photographs, their reconstructions of the above-ground and even subterranean spaces are incredible. I find the expanse of the mastabas amazing. Granted, it looks 'sterile', but I think that's something to strive for. My favorite part is when the camera cuts underground and you can see the subterranean chambers straight on. With Clark talking about plugging Uhle's photos into the reconstruction, it would be interesting to then have the burial chambers shown intact in the simulation.
Finally, I participated in an eye-tracking experiment yesterday. This is in response to a paper that Carly found a week or two ago discussing the usefulness of eye-tracking technology within visual heritage contexts. Eye tracking without headgear works by beaming a light into the eye; the light is reflected by the retina and picked up by a receptor on the same plane as the light source. Even though it sounds great in theory, there are problems that rule it out for museum use for now. One problem is that corrective lenses (eyeglasses and contact lenses) introduce large errors; I wear soft contact lenses, and my eyes could not be tracked accurately. Another problem is that reflections from eye makeup also cause false readings. Eye tracking is a great idea for finding out how the eye moves across a museum display, but it should probably be kept on the back burner until the technology matures.
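For my own notes on how the reflection becomes a gaze point: a common approach (I can't be sure it's the exact one in the paper Carly found) is to locate the bright reflection and the pupil in the camera image, then map the vector between them to display coordinates through a per-user calibration. Here's a toy sketch with NumPy; the frame and calibration numbers are invented purely for illustration.

```python
# Toy pupil-reflection gaze estimate. The calibration matrix here is
# invented for illustration; real systems fit it per user and session.
import numpy as np

def brightest_point(frame):
    """Pixel coordinates of the light-source reflection (the glint)."""
    return np.unravel_index(np.argmax(frame), frame.shape)

def darkest_point(frame):
    """Rough stand-in for the pupil center (darkest pixel)."""
    return np.unravel_index(np.argmin(frame), frame.shape)

def gaze_point(frame, calibration):
    """Map the glint-to-pupil vector to display coordinates."""
    glint = np.array(brightest_point(frame), dtype=np.float64)
    pupil = np.array(darkest_point(frame), dtype=np.float64)
    vec = pupil - glint                       # shifts as the eye rotates
    return calibration @ np.append(vec, 1.0)  # affine map to screen x, y

# Fake 8x8 'camera frame': bright glint at (2, 3), dark pupil at (5, 5).
frame = np.full((8, 8), 128.0)
frame[2, 3] = 255.0
frame[5, 5] = 0.0
calib = np.array([[40.0, 0.0, 400.0],    # made-up numbers: screen pixels
                  [0.0, 40.0, 300.0]])   # per pixel of eye-image offset
print(gaze_point(frame, calib))
```

Picking the single brightest and darkest pixels is exactly where things go wrong in practice: a contact lens or shimmery eye makeup adds extra reflections that logic like this mistakes for the glint, which fits with my lenses throwing off the tracker.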