ONLINE SEARCHER: Information Discovery, Technology, Strategies

Fashion for All: Using Augmented Reality to Explore Digital Archives
By Miyo Sandlin and Joseph Anderson
January/February 2021 Issue

Chanel. Halston. Jean Patou.

Original sketches and drawings by these acclaimed fashion designers rest within acid-free boxes held in the climate-controlled facilities of the Special Collections and College Archives (SPARC) at the Fashion Institute of Technology (FIT). To view them, researchers must make an appointment in advance and are limited in the number of items they can view at a time.

But how can we make those collections something that more people can access and explore? How do we take the fun of historical fashion and use it to engender deeper insights?

Researching fashion can be a lot of fun: It is full of surprise and discovery. On a deeper level, though, fashion and material culture can give us an important understanding of how people lived in eras gone by. When researchers can study and visualize the process of garment and accessory design and creation, it gives us greater insight into societal mores as defined by race, class, gender, and age.

Our augmented reality (AR) project, “The FITting Room” (fittingroom.fitnyc.edu), is one answer to these questions. Our project takes high-res digital scans of fashion illustrations and turns them into AR filters that allow users to “try on” a historical piece of clothing and see how it looks on them.

Increasingly, technology is bridging the gap between the public and archival material. With this project, we hope to bring archival sketches to life and increase the accessibility of our collection. Snapchat brought you the “puppy” filter. Almost right on its “heels,” we are bringing you the “Halston coral hat with bows” filter.

Make it Fun

This project was first conceived after learning about AR and the Zappar (zappar.com) AR platform during a visit to FIT’s Faculty Research Space, an on-campus center for innovative technology that faculty can incorporate into their teaching and projects. Playing around on the Zappar platform led to a number of ideas of how it could be used for educational purposes. And then came a series of questions: How can we make it fun? How can we make it so students want to download an app to experience our projects?

At the FIT Library, we have an amazing archive with unique digital items such as pochoir plates from the art deco era with floral dress patterns that leap off the page. One possibility that immediately came up was that the hats and accessories in these drawings could be turned into AR face filters, a technology that is growing in popularity among 18- to 25-year-olds, the exact age range of most of our student body. Although Snapchat only introduced its AR filters in 2015, Statista predicts that there will be 1.96 billion mobile AR users globally this year (statista.com/statistics/1098630/global-mobile-augmented-reality-ar-users).

Thanks to the Faculty Research Space and Zappar’s online tutorial (youtube.com/watch?v=cJ59VQiBFZk), the first workable AR filter was ready in about an hour. It was remarkable how quickly we could Photoshop the hat from a fashion sketch and then use Zappar’s face-tracking functionality to “wear” that hat on our own heads. However, it’s a long road from playing around with software to delivering a final product for the public. Here are some lessons we learned along the way.
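For readers curious about what the moving parts look like in code, here is a minimal sketch of a face-anchored hat filter. It assumes Zappar’s Universal AR SDK for three.js rather than the platform’s built-in templates we worked with, and the asset file name and position offsets are invented for illustration.

```typescript
// Minimal sketch of a face-tracked "hat" filter, assuming Zappar's Universal
// AR SDK for three.js (our own filters were built with the Zappar platform's
// templates, so treat this as illustrative rather than our actual setup).
import * as THREE from "three";
import * as ZapparThree from "@zappar/zappar-threejs";

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Hand the GL context to Zappar and start the user-facing (selfie) camera
// once the user grants permission.
ZapparThree.glContextSet(renderer.getContext());
const camera = new ZapparThree.Camera();
ZapparThree.permissionRequestUI().then((granted) => {
  if (granted) camera.start(true); // true = user-facing camera
});

const scene = new THREE.Scene();
scene.background = camera.backgroundTexture; // show the live camera feed

// Everything added to the face anchor group follows the tracked head.
const faceTracker = new ZapparThree.FaceTrackerLoader().load();
const faceAnchor = new ZapparThree.FaceAnchorGroup(camera, faceTracker);
scene.add(faceAnchor);

// A flat plane textured with the Photoshopped hat scan, nudged up above the
// forehead. The file name and offsets here are made up for the example.
const hatTexture = new THREE.TextureLoader().load("halston-coral-hat.png");
const hat = new THREE.Mesh(
  new THREE.PlaneGeometry(1.2, 0.8),
  new THREE.MeshBasicMaterial({ map: hatTexture, transparent: true })
);
hat.position.set(0, 0.7, 0);
faceAnchor.add(hat);

renderer.setAnimationLoop(() => {
  camera.updateFrame(renderer);
  renderer.render(scene, camera);
});
```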

Keep it simple

When you first discover a new program or piece of software, you might be tempted to get really ambitious with all the possibilities it presents. We had a lot of ideas for incorporating information about the original sketch, the collection, the year, and the name of the designer using vintage-looking fonts. We realized it was very easy to include a button that people could click so they could view the item within our digital collections. We could even include an image of the designer’s original handwritten signature! How cool! The possibilities were endless.

Screen real estate, however, is not infinite. Smartphones have very limited screen space, and you can quickly overwhelm the visual interface with too much text. We soon learned that for any text to be legible, it needed to take up from 25% to 30% of the screen. As you can see from some early examples, adding only two extra elements in addition to the hat quickly made the interface very cluttered.

There is another challenge to incorporating metadata and buttons: Do people who are using the app for fun want screenshots of themselves “modeling” the attire to be cluttered with extraneous text or a giant black button across their chin? Our users probably don’t want that stuff in the way. (See Figures 1 and 2.)

Consider all your users

One early direction we considered was incorporating makeup. The highly stylized sketches had beautifully detailed faces with charming vintage appeal, such as arched 1960s eyebrows, luminous red 1950s lipstick, and Edwardian-era blush from a time before face contouring.

But this is easier said than done. Zappar has a Face Paint template that allows you to stretch an image onto a face mesh file. However, as is clearly apparent from the early prototypes, this had a terrifyingly uncanny result. It was more like something out of Jim Carrey’s The Mask than La Belle Époque. (See Figure 3.)

Not only was it terrifying, we realized early on that it presented a problem for anyone whose skin tone didn’t match the original image, particularly anyone with darker skin. The tech world has a well-documented history of racial exclusion when creating new technology, including a soap dispenser that refused to recognize darker skin tones (gizmodo.com/why-cant-this-soap-dispenser-identify-dark-skin-1797931773), so it was important to keep racial inclusivity in mind. In addition, our school recently had a troubling incident involving racist imagery in a fashion show, so we strove to be as thoughtful and sensitive as possible in developing our project. We realized that simply copying and pasting the sketch was not a workable solution. Any makeup would have to be done from scratch and painted on by hand, an extremely laborious process to do with a mouse without the aid of something like a Wacom tablet.

We decided the only makeup aspect that still might be worth the time to create, and would still have a big impact, was lipstick. One design, “Black Day Suit with Flared Jacket and Fitted Skirt,” had a particularly striking lip color (sparcdigital.fitnyc.edu/items/show/951). Using the eyedropper tool, we copied the shade of red, painted it over the lip area of the mask, and changed the opacity to 30%. (Anything higher gives you clown lips!) This is one of our most popular filters, and we do think the lipstick has something to do with it, but as you can see, it was not a straightforward path to get there! (See Figure 5.)
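For anyone who wants to see that compositing step spelled out, here is a rough browser-code equivalent of what we did by hand in Photoshop: tint a lip-shaped mask with the sampled red and lay it over the face texture at 30% opacity. The function, file names, lip mask, and exact hex color below are hypothetical.

```typescript
// Sketch of the lipstick compositing step: paint the sampled red over the lip
// region of a face texture at 30% opacity. We did this in Photoshop; the
// helper, file names, and lip mask here are hypothetical.
async function addLipstick(
  faceTextureUrl: string, // the face texture from the sketch
  lipMaskUrl: string,     // white-on-transparent mask covering just the lips
  lipColor = "#b3122f"    // shade sampled with the eyedropper (illustrative)
): Promise<HTMLCanvasElement> {
  const [face, lips] = await Promise.all([
    loadImage(faceTextureUrl),
    loadImage(lipMaskUrl),
  ]);

  const canvas = document.createElement("canvas");
  canvas.width = face.width;
  canvas.height = face.height;
  const ctx = canvas.getContext("2d")!;
  ctx.drawImage(face, 0, 0);

  // Tint the lip mask with the sampled color on an offscreen canvas...
  const tint = document.createElement("canvas");
  tint.width = face.width;
  tint.height = face.height;
  const tctx = tint.getContext("2d")!;
  tctx.drawImage(lips, 0, 0, face.width, face.height);
  tctx.globalCompositeOperation = "source-in"; // keep color only where the mask is
  tctx.fillStyle = lipColor;
  tctx.fillRect(0, 0, face.width, face.height);

  // ...then composite it over the face at 30% opacity. Anything higher reads
  // as clown lips.
  ctx.globalAlpha = 0.3;
  ctx.drawImage(tint, 0, 0);
  ctx.globalAlpha = 1;
  return canvas;
}

function loadImage(src: string): Promise<HTMLImageElement> {
  return new Promise((resolve, reject) => {
    const img = new Image();
    img.onload = () => resolve(img);
    img.onerror = reject;
    img.src = src;
  });
}
```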

One bit of feedback we got was a suggestion from a colleague to add menswear. We hadn’t set out to exclude menswear, but we didn’t actively remember to include it when making our choices about which hats to feature. Menswear often gets short shrift in terms of attention and coverage from the fashion industry. Once we realized we were about to make the same mistake, we went back to our digital archives and found some fantastic menswear to incorporate. One of them, “L’Officier Duc” (sparcdigital.fitnyc.edu/items/show/2882), has become our second most scanned filter. (See Figures 6 and 7.) It would have been a shame to leave out an entire aspect of fashion history simply by failing to consider historical menswear examples.

Pivoting in Response to a Pandemic

As we were developing our AR filters, we started to think about how we would roll out our experience. What we envisioned was, in many ways, conventional and adapted from how we had seen AR utilized in museum settings. We would print a series of high-resolution posters of the fashion sketches with an accompanying QR code to trigger the experience.

The posters would be displayed across the library and FIT campus. Keeping in line with what we had learned in developing the filters, we wanted our poster campaign to be short, simple, and pithy. No detailed instructions to clutter things or bore a student. They already know how these things work. We wanted the poster and the QR code to be an invitation, a mystery. “Try on a piece of history,” we would tell them, and the result would be a surprise, a smile.

However, just as we were beginning to develop our poster campaign, which we were certain would captivate the FIT community by the thousands, the world was radically changed by COVID-19. The FIT campus was shut down for the foreseeable future. (See Figure 8.)

The major challenge in trying to re-envision our project in the new remote environment was that the popular conception of AR is inextricably tied to the smartphone and its camera: The smartphone camera scans some code or is triggered by some physical object to launch an experience. It works because the smartphone is so prevalent; it’s always in our pockets.

However, what became readily apparent in late March of 2020 is how prevalent the old, forgotten webcam would become in our lives as part of our daily videoconference meetings. This development offered us clues to how we could move our project forward: Our users didn’t necessarily need to use a smartphone.

Webcams are not widely used as part of AR experiences because the process would be incredibly cumbersome: Nobody wants to lift an iMac off the desk to point its webcam at a QR code. However, the last year has seen the rise of WebAR, technology that allows users to launch AR experiences directly from a web browser (arpost.co/2020/01/08/webar-adoption-going-mainstream-in-2020). Ostensibly, the purpose of this technology is to save users the trouble of downloading a specific application to use AR, but it also lets you launch an AR experience with just a URL rather than by scanning a code. This, coupled with a new use of webcams, is where everything started coming together: Our potential users weren’t necessarily tied to one type of device.

Options for all

In the end, the solution to the problem of bringing our project into the remote environment was quite simple—a single webpage where users could launch our AR filters from any device. Are you using a webcam? Load it on the webcam. Looking at the site on a smartphone? That works too. On a desktop but prefer to experience the filter on your phone? Yup, no problem, the site can generate a QR code so you can scan the screen with your phone.
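To make that “works on any device” behavior concrete, here is a simplified sketch of the launch logic: any device can open the WebAR link directly (phone camera or desktop webcam), and on a desktop the page also draws a QR code of the same link to scan with a phone. The qrcode npm package, element IDs, and URL are illustrative choices, not a description of how our page is actually built.

```typescript
// Simplified sketch of the device-aware launch logic. The `qrcode` npm
// package, element IDs, and URL below are illustrative, not a description of
// how The FITting Room page is actually built.
import QRCode from "qrcode";

const FILTER_URL = "https://example.com/fitting-room/halston-coral-hat"; // placeholder

function isProbablyMobile(): boolean {
  // Coarse check: touch support plus a small screen. Good enough to decide
  // whether a "scan with your phone" QR code is worth showing.
  return "ontouchstart" in window && window.matchMedia("(max-width: 820px)").matches;
}

async function setUpLaunch(): Promise<void> {
  const launchLink = document.querySelector<HTMLAnchorElement>("#try-on")!;
  const qrCanvas = document.querySelector<HTMLCanvasElement>("#qr")!;

  // Any device can launch the WebAR experience straight from the link:
  // a phone uses its camera, a desktop or laptop uses its webcam.
  launchLink.href = FILTER_URL;

  if (isProbablyMobile()) {
    // A QR code is pointless on the device you would scan it with.
    qrCanvas.hidden = true;
  } else {
    // On a desktop, also draw a QR code so users who prefer their phone
    // can scan the screen instead.
    await QRCode.toCanvas(qrCanvas, FILTER_URL, { width: 220 });
  }
}

setUpLaunch();
```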

To be honest, it seems quite obvious in retrospect. The challenge, again, was breaking from the conventional notion of how AR works. Once we did that, putting together the actual webpage was only a few days’ work. We used Jekyll, the popular static site generator, and GitHub Pages for free hosting. To design the site, we used Bootstrap, the CSS framework, along with icons from Font Awesome. (See Figure 9.)

An advantage of using a webpage to roll out our AR filters is that including additional educational information about the fashion sketches is more seamless. In our original conception of using a poster, we worried about clutter: too much information discouraging a potential user from trying the AR experience. On the webpage, however, all that information sits behind a simple info icon that can be clicked (or not) to link to SPARC Digital (sparcdigital.fitnyc.edu), where detailed information about each fashion sketch can be accessed. Another advantage of the website is that each user can access every filter at once, whereas in our original scenario, they would have been limited to the filter corresponding to the particular poster they scanned.

The disadvantages of this method include the loss of serendipity and mystery that were part of our original poster campaign. The webpage just doesn’t work practically without some explanation, and with that, some of the magic is lost on the user.

CONNECTING PEOPLE THROUGH FASHION

We developed this project on the premise that it would connect new and different types of people to our collections. It was the driving force throughout. However, as we put the finishing touches on our webpage, sharpened up our filters, and then finally launched, we started to realize that even though our project had “finished,” our work connecting to users would need to be ongoing.

The final lesson we learned after reviewing user analytics was how essential ongoing marketing is to the success of a digital project. Our original poster campaign would have taken advantage of students walking to and from classes who would serendipitously discover and interact with the project. On the web, there’s no such thing as “through traffic” for siloed webpages. We quickly realized that every bump in the number of scans corresponded to the date of a promotional post on social media. It’s important to share information about your AR project on multiple platforms, multiple times. (See Figure 10.)

Since launching The FITting Room in April 2020, more than 1,000 unique visitors across 16 countries have “tried on” our hats about 1,400 times, according to Zappar’s analytics. It’s not quite on the global scale of the “flower crown” and the “puppy” filter, but it’s been wonderful to hear the reception from users and see all the selfies people have shared. The number-one reaction we tend to get is “Fun!” which was definitely our goal: to make people smile and make fashion history fun. We hope to continue to build more filters and continue to use The FITting Room as a tool to engage students and the public.



Miyo Sandlin is research and instructional services librarian, adjunct assistant professor, Fashion Institute of Technology.

Joseph Anderson is assistant professor, digital initiatives librarian, Fashion Institute of Technology.

 

Comments? Contact the editors at editors@onlinesearcher.net
