
Alison Hu — E Studio IV

Updated: Mar 29, 2022

Project 1 | Intelligence in Environments


Shared public spaces are being equipped with the Internet of Things (IoT). Data are collected, stored, used, and shared. However, the data that these systems collect remain intangible to the occupants of these spaces.


This project focuses on investigating, understanding, and/or materializing novel modes of engagement with data through new forms of probes, prototypes, and speculative design.


Reading Reflections

Molly Wright Steenson's talk was a great overview of the history of artificial intelligence and its modern-day impact on design and architecture, especially because I was not familiar with many of the figures she mentioned. Many of the examples she showed, such as Madeline Gannon's robot arm, touched on important topics like the role of emotions and humanistic qualities in technology. Her talk raised a lot of new questions about how future developments in AI might impact design and what that world could look like, and I'm curious to hear about more examples of thoughtful uses of AI. For this upcoming project, it was also really helpful to hear about positive interactions and impressions people have with data, particularly through Mimi Onuoha's work on missing data sets.


In Anatomy of an AI System, I found the part about lithium and volcanoes particularly interesting because I had never really considered how AI fits into our “natural” system or world. I typically categorize technology as human-made and the natural environment as a separate system of its own, so this article's perspective of “linked visible threads of commerce, science, politics, and power” was really refreshing. This reading made me wonder whether the general public would become more accepting of AI if it were viewed as a “media technolog[y] understood in the context of a geological process.” The interconnections of earth, AI technology, and labor and social structures reminded me a lot of our Scale unit in sophomore E studio; by analyzing a system from the micro to the macro scale, you can begin to really understand how vast a subject is.

In David Rose's Enchanted Objects reading, I really enjoyed the section on Calming Objects. Conversations in design often focus on efficiency, innovation, and production because we typically think of “solving a problem” when designing products, services, or systems. The reading made me wonder whether designing for the “universal thirst for full engagement” is always the most effective approach; if technology is to be fully incorporated into our daily lives, the challenge will be using it to design for stillness and calmness. I also wonder to what extent each of the six future fantasies overlaps with the others; the objects with digital shadows reminded me of the Anatomy of AI reading because they highlight how technology can entirely alter an environment or system and introduce new experiences and products. How does an increase in hackable objects impact products' digital shadows in our future society? Instead of having objects designed around human behavior, at what point might these enchanted objects start to change how humans behave on a daily basis?


Considerations + Initial Thoughts

These readings prompted a lot of questions about the relationship between a smart system and a user's behavior. I'm curious to know what kinds of activities are unique to TCS Hall that could be effectively impacted or augmented by smart systems, and how the context of a working space influences those functions. The data being collected by the Mites sensors can either be used for a specific function or visualized in a way that naturally produces new interactions or engagement in the public space.

Active Engagement

In a practical sense, IoT devices can increase productivity and collaboration among students by making small actions more convenient (e.g., checking whether a room is occupied). In a more exploratory sense, users could interact with the data in a way that implies sensing and tracking but is displayed in a different format.

  • How does the gamification of data increase public approval for tracking and sensing?

  • How can data be represented in more playful ways that are easier to digest?

  • How can a smart system increase interaction and collaboration within working spaces?

  • What priorities do people have in working spaces? Privacy? Productivity?

One example of a form of active engagement with passing users is an interactive billboard displayed in Quebec to raise jaywalking awareness. While the ultimate goal of the billboard is to promote safety when crossing the road, the interactive display immediately caught people's attention and implied that their movements were being tracked.


Another cool example of this is the Morphing Clay display by Google that we talked about in class last semester. The goal of the experience is to engage younger generations with traditional Chinese ceramics by using a machine learning platform that shapes pottery vases in response to real-time poses and gestures.



Ambient Engagement

I'm also interested in the idea of ambient devices or nudges that provoke or change a behavior. I'm curious to explore how the subtle display of a behavior, pattern, or statistic can alter people's actions and decisions.

  • How does the ambient display of data or information nudge a person to change an interaction or behavior?

  • How can you capture the attention of a user and help them draw connections to the meaning of the data?

  • What patterns and behaviors already exist within TCS Hall that a smart system can fluidly integrate with in an unobtrusive way?

  • How does an awareness of one's contribution of data change their decisions?

  • What data or information would people enjoy learning about themselves or the spaces around them?

David Rose's Office Furniture for Future Collaboration is a great example of a subtle display of information in a working space. As described: “The Balance Table hopes to subtly encourage more conversational mindfulness by illuminating a constellation of lights in front of the speakers using the most airtime. They glance down, see the imbalance on the table, and naturally encourage others to contribute.”


Another example that explores a similar topic is a paper by researchers from the Pervasive Interaction Lab on case studies of ambient displays that nudge people to change their behaviors. One of the case studies describes:


An example is a sculpture that sits near a person’s computer monitor and slumps over if that person continues to sit without taking a break. After taking a break the sculpture sits upright and is assumed to be healthy. The way this kind of ambient display is assumed to influence is by raising people’s awareness of a particular behavior that they normally overlook or try not to think about.


The main study was an ambient display of data on how many people take the stairs versus the elevator. While the goal of the study was to discover what forms of data representation are most communicative, it made me think about how these small displays of information in a public space can influence or change people's awareness of their surroundings. As one of the techniques (Follow-the-Lights) features a physical pattern of LED lights that follows people's movements through the space, it made me think about how to utilize a system so that it is well-integrated and delightful to notice or experience.

After visiting TCS Hall to get a better understanding of the space, I sketched out some interactions that could occur in a shared work environment. I wanted to explore different possibilities using the sensors' varying functions.


1. The first interaction I sketched out was a form of active engagement that takes a playful approach to a functional problem. Students who are looking for places to work can view a room's occupancy through a projection of abstract shapes on the outside of work rooms.


2. The second interaction I imagined focused more on subtle engagement with the user to change a behavior or promote a better workflow. When students pull out a chair, a piece of furniture or a sculpture becomes brighter so that the work space is more illuminated. This encourages students to gravitate towards each other and share working spaces, while still allowing individual students to focus.
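To make the logic of this second interaction concrete, here is a minimal sketch of how occupied chairs might map to a light level. The function name, chair count, and brightness curve are all my own illustrative assumptions, not part of the Mites platform:

```python
# Hypothetical sketch of the chair-to-light interaction.
# Names, thresholds, and the brightness curve are assumptions
# for illustration -- not an actual Mites API.

def lamp_brightness(occupied_chairs: int, max_chairs: int = 6) -> float:
    """Return a brightness level from 0.0 (dark) to 1.0 (fully lit).

    Brightness rises with each chair pulled out, so the sculpture
    glows brighter as more students gather at the table.
    """
    if max_chairs <= 0 or occupied_chairs <= 0:
        return 0.0
    ratio = min(occupied_chairs, max_chairs) / max_chairs
    # Keep a dim ambient glow once anyone is present.
    return round(0.2 + 0.8 * ratio, 2)

# An empty table stays dark; a full one is fully lit.
print(lamp_brightness(0))  # 0.0
print(lamp_brightness(3))  # 0.6
print(lamp_brightness(6))  # 1.0
```

The baseline glow at one chair is a design choice: even a single student's presence changes the space, while the full bloom of light is reserved for a shared table.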


Feedback

In class, I received a lot of feedback on different directions to move towards. It was helpful to know which concepts felt engaging and had potential, and many of my peers pointed out different interactions to consider with my ideas.

Feedback Takeaways:

  • The implementation of concept 1 doesn't necessarily need to be through a projection or display; what other objects or forms can achieve the same reactions and functions?

  • Consider having multiple tables or objects interact with each other across the system instead of just within one area

  • How can I help users draw connections between the data being collected and the interaction being shown? How can I influence the way they perceive these connections?


Thursday, January 27

Reading Reflections

I liked the way IoT Data in the Home: Observing Entanglements and New Encounters described data as “lively and affecting.” This description made me think a lot about the type of reaction I would want users to have when noticing or engaging with my interaction, particularly when it comes to the idea of a “living” building that creates a responsive environment for users. When the reading mentioned that data requires narratives to contextualize meaning and shape, it made me think about what types of narratives about data are more accepted than others. For example, the Spotify Wrapped feature releases personal information and statistics about its users’ listening patterns, yet it is one of the most highly anticipated releases each year. The idea of having different interpretations of the same data made me wonder: in what contexts do users feel comfortable or uncomfortable seeing data about themselves? What makes learning about your own data enjoyable and shareable? Each of the different approaches in the reading made me curious to know how much of the natural world—and interactions within it—needs to be mimicked or replicated in order for users to feel comfortable with data.


Data Materiality Episode 4: Yanni Loukissas on Understanding and Designing Data Settings was a really interesting perspective on data as a whole. The idea of “data settings” versus “data sets” speaks to the importance of perception, and it makes me wonder whether people’s skepticism of data was inevitable. Similar to the process described in The Anatomy of AI, I wonder if more people would accept data collection if it were viewed as a natural process intertwined with natural human behavior. As a designer, I'm curious how we can help users draw these connections in a system. At the end, Loukissas’s warning against working with only the data and not people made me think a lot about our current project and the importance of visiting TCS Hall and seeing the space myself. Moving forward, I’m wondering what patterns of behavior occur in the building that I can effectively incorporate into my interaction to make sense of the data being collected.


Research Question Development

After sharing concepts, I teamed up with Eric and Rachel to tackle the direction of ambient displays as a form of data representation. As we brainstormed different concepts and areas we wanted to focus on, we highlighted the main outline of our merged idea:

  • Objects communicating with each other (similar to Invisible Roommates)

  • A responsive, live environment that is connected to people's motion (similar to Daniel Rozin's work)

  • Ambient displays that help people draw connections from their behavior to a change in their environment (similar to this case study)

Based on these overlapping ideas, we narrowed it down to a form that picks up on motion and creates an ambient reaction that catches the user's attention. This reminded me of the slouching statue example mentioned previously from this case study. I sketched out the interaction (below) to demonstrate the idea of an object nudging a change in the user's behavior using tracking data.

An interaction like this would occur for the duration of the user sitting at the work space. However, our team also discussed the impact of representing data collected over long periods of time. For example, what if users could see a physical representation of how many students had studied there earlier that week? Month?


As we moved forward with narrowing our concept, we began refining the research question and building out further considerations.

  • How does the ambient display of data or information heighten a user’s awareness of their behavior?

  • How can an environment respond to individuals and their actions within the space over both a short and long term period?

  • How can AI make people aware of their movements through a space? How can motion be translated into structure(s) that connect to people within a space over a short or long period of time?

  • In a shared work environment like TCS, how can AI aid people in their work or help create a more comfortable work environment, whether by easing stress or assisting in tasks?

  • How can AI and this data support people (students) in studying and working? In what ways can it help individuals track and manage their time?

We thought about how the building could respond to an individual's movement by providing short-term and long-term feedback on the data being collected. As students watch long-term data accumulate over time, the object/form representing the data can show information that people may be interested to know, such as the number of students who have studied there.


After our brainstorming session, I sketched out an example interaction that touched on the main ideas we discussed. While there were many considerations that still need to be factored in, the overall concept was an ambient reaction to a student's movement that represented short term and long term data collection.

Moving forward, our team talked about the possibility of different interactions in two main areas of the building: halls/stairs and public working spaces. Depending on our ultimate goal for the user's behavior, we could explore different ways to engage users' focus through small movements or visual displays.


Based on the main outline of our concept, we created a research page of guiding questions, possible interactions, and factors to consider.




Feedback

  • Main questions and subquestions are really interesting; try to narrow it down by next week

  • The green plants in the space and flowers on the wall could be interesting in terms of putting a trace of yourself in that space; it almost creates a virtual 3D environment where all these traces live and grow

  • It feels like if a student visits the same space, they could see the plants floating in the hall space through a phone or glasses; this builds on the idea of data over time

  • Curious about the idea of participation or group interaction


Tuesday, February 1st

Reading Reflections

The reading Designing the Behavior of Interactive Objects describes the method of Personality, in which designers develop the behavior and aesthetics of an object’s interactions. This paper emphasizes the importance of designing for the what, how, and why of an object as opposed to focusing only on the function; by approaching an object’s design from each of these variables, we can find ways to better resonate with users and achieve the intended goals of a design beyond its functionality. I really like the idea of assigning “stereotypes” to an object or experience because it outlines the aspect of a design that most resonates with the user. For example, with the sofabot, many of the participants’ stereotypes felt like potential personas that could make the interaction more intuitive and the object more memorable. Titles like The Attention Seeker could help guide visual characteristics that make the object stand out more, and the interactions that resonate with users could include large movements or sounds. This reading reminded me of the branding unit from last semester that described brands as a way for people to make sense of the world. As people naturally gravitate towards objects and experiences that make sense to them, catering to these preconceived ideas and stereotypes can help refine a design and establish a stronger memory of its interaction and behavior.


Refining Our Concept

After our last feedback session, my team began working to narrow down our research space. Looking at our individual storyboards, we merged the concepts together to lay out a more defined direction.

As our concept moved closer towards virtual/augmented traces of student presence in the space, we were heavily inspired by many of the projects from Everyday Experiments. This Optical Soundsystem project helps users visualize the way sound travels through a home. We wanted to incorporate this subtle, yet informative way of visualizing student behavior in TCS similar to how the optical soundsystem allows users to visualize the music in their close environment. This system introduces ways to inform the user of the data in the environment and spark curiosity about behavior within the space.


Because we were interested in both a short-term interaction/nudge with an object and a visual representation of long-term data of multiple users, it was really helpful to look at old case studies with IoT objects and pair it with projects from Everyday Experiments. However, one major question to tackle in our experience was finding the best way to transition from direct engagement with the user's data to encouraging users to view other data and behavior from students over time.

Based on the overall idea of our interaction, we started looking into example case studies, visual inspiration, and different tools to start low-fi prototyping our concept. Our three words that we chose to describe the personality of our object/experience are whimsical, gentle, and encouraging.


Our group began creating some low fidelity physical and digital prototypes of different aspects of our interaction. We created a Figma board to collect all our progress and video concepts. Rachel created a gif of the growing plant orb concept to demonstrate the idea of object interaction, and we started to brainstorm some ideas for the physical object the student would interact with.

Digital AR Prototype

I created a video prototype of the student data garden in AR using Adobe Aero. While the interactions would need to be fleshed out, the overall idea is an AR platform that allows students to view their own data/plant and browse other students' data. This would allow them to see student behavior in the space over time and compare their own behaviors to others', ultimately becoming more informed about the environment.


Moving forward, we want to focus on working out the object-to-AR transition to make sure the interactions are all intuitive and helpful. How can we inform students how to engage with the data in AR? What features would be helpful for students to use to get a better understanding of the data?


Not only do we want to increase the fidelity of our prototypes to better represent our concept, but we also want to ensure that our designs guide and inform the user in an unobtrusive way. I'm curious to explore types of expansive materials that will grow into physical objects/plants and return to their previous positions. While there is the option to move our interaction entirely to AR, we wanted to ensure that we could really capture student attention with a physical object and interaction.


Feedback

  • Thinking about individual and collective information is really interesting — the idea of viewable collective behavior change has potential

  • Pick one aspect of behavior and one sensor or kind of data that can be the focus here

  • What is the main incentive of viewing the information of behavior in the space?

  • Are there other things that could be documented other than duration that can be played with to better inform the users?

  • Is there a need for AR? Or can this aspect of the interaction be solely captured through the object itself?

  • The plant itself could be more abstracted (unless the choice for the context is justified); the word "whimsical" could define the form of the plant. It also could be partially digital and partially physical


Thursday, February 3rd

Reading Reflection

Giorgia Lupi’s talk, Speak Data, was a really refreshing perspective on creative ways to view data. Her point about data just being an “abstracted form of our reality” made me think about how many of our projects are focused on making data more approachable and humane. This perspective was interesting because it implies that data itself represents an almost neutral or harmless representation of reality; this made me wonder whether it would be more effective to communicate the idea that data is more of a translation than a representation of behavior in our projects. I also really loved the way she emphasizes creativity in data to reach a wider audience (through clothes, etc.), and it reminds me of our conversation in class today about how not all data needs to be seen as practical and functional. There are other ways for people to familiarize themselves with data without solely focusing on efficiency and productivity.


Values in Our Concept

After Isha's talk during class, we discussed the values and takeaways from our concept to really understand what areas needed development. Based on our feedback, we felt that awareness was our largest value or takeaway from the interaction. We want the user to feel a deeper connection to their own behavior and the space around them, including the behavior of others. We hope that this newfound awareness will prompt the user's curiosity and invite them to see data in a more natural and integrated light.


This project called Data Domestication has a similar value of awareness as it creates a physical representation of collected data for users to better understand their environment. I really liked the abstract/open interpretation that the physical form has (bird cage) because it has a more distinct personality and has the appeal of creativity. It made me think a lot about our plant/orb concept and ways we might be able to represent the form in appealing and engaging ways for students in the work hall.


I also found Giorgia Lupi's physical wall and AR installation for Starbucks in Milan to be closely related to the latter half of our interaction. Her project featured a brass wall on the Milan Starbucks Reserve Roastery that allowed users to learn more information about Starbucks' history via augmented reality. I really liked the way the wall had multiple layers of reaction/engagement from viewers (see bottom right image).

This project made me think about how we could incorporate these reactions and attention to detail in our own project in order to successfully encourage users to learn more about the data being collected in the space. Both of these projects helped highlight the importance of aesthetic and personality in resonating with users in a space.


With the main structure of our concept outlined, we started to narrow in on the interactions themselves. We brainstormed a variety of different interactions and forms of implementation that would achieve our intended reactions.

Feedback

  • The physical manifestation of the data is really interesting

  • Even though we are leaning towards the behavior of the person, there's a potential in the idea of a hanging plant coming down based on motion or sound

  • The concept of a data garden is also interesting

    • It's important to remember the AR is not about the plants but the data being collected in the space

    • How do I learn that data is being collected about me, or shared?

      • What type of data is being collected?

  • What do I get out of using the AR experience?

    • If the information is changing over time, it can be engaging to have the user observe it several times

  • Maybe instead of having the plants on the wall, they can be closer to the Mites sensor so people can be aware of the location of the sensors


Tuesday, February 8th

Building Out the Interactions

After exploring a breadth of different interactions that conveyed our concept, we had many brainstorming sessions to make sure we really understood the goals and advantages of our idea.


Our team had a challenging time balancing the relationship of physical/digital interactions as we tried to figure out how much of the experience should be based in AR or in the space itself. We started to look back at the interactions we had previously outlined to identify the strengths and weaknesses of each idea, but it was hard to "solve" one weakness without sacrificing another part of the interaction. To look for directions to move in, we started building out thoughts, questions, and considerations of our variety of ideas and created inspiration moodboards to expand our ideas on possible interactions.



After long discussions, our team decided to revisit our original interaction of a physical representation of personal data and a digital representation of the collective data of behaviors. We wanted to get a better understanding of the main goals and takeaways of our concept in order to narrow and refine the objective of our experience. Although we were originally too focused on the how aspect of the project, we needed to work through why we chose this direction/concept and what takeaways a user would have from it.

From here, we sketched out some more concise interactions that could better illustrate our concept of building a mutual relationship with your environment. While we still need to work out the details and physical form of our concept, it was helpful to identify our main value in our interactions.

Feedback

  • Adding in the different components of the wind, light, etc. as representations of the data could be confusing for the user and create too many steps

  • Consider the balance of representation of data in the AR app/platform. How will the users know the connection of the interaction and the Mites sensors?

  • There could be an AR trail that connects the plant to the location of the Mites sensors or vice versa; this can help bridge any misconceptions and heighten their awareness of the sensor locations


Thursday, February 10th

Tangible Interaction Analysis + Form Development

In class, we did a workshop analyzing the tangible interactions of case studies that had similar experiences as our current project. We laid out a set of cards that encouraged us to approach interactions with different mindsets. After learning about similar case studies and other projects, our group revisited our own concept to see how we could build out our interactions to better connect with the user.

This activity was helpful for our team to better understand what contact points are important for the user and what objects/components are involved in our interaction. For questions that we answered "no" for, it was important for us to brainstorm ways we could build it out to be more impactful or understand why we were not including certain "cards" in our idea.


Answering the cards and questions reaffirmed our goal of including a physical object to interact with in our concept. Although the AR part of our idea would help users learn more about data, we really wanted to include a physical representation of data that would inform the digital representation. Not only would this be more approachable and accessible, but it also allows us to explore the form and arrangement of the plant indicator itself.

Afterwards, we continued to build out our ideas for the form and looked into case studies/ways to make our interaction possible. I also started sketching out some different plant forms as we tried to find a feasible form that resonated with people.



Tuesday, February 15th

Systems Diagrams + Prototyping

Next, we built out a systems diagram of all the components of our concept to lay out the interactions and objects. We broke the main components down into the data system, the data connected to the plant, and the plant-to-AR experience.
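The three components of the diagram can be sketched as a simple pipeline. This is a hypothetical illustration only — the class, thresholds, and display text below are my own placeholders, not our actual implementation:

```python
# Hypothetical sketch of the systems diagram: sensor data -> plant
# state -> AR layer. All names and thresholds are illustrative.

from dataclasses import dataclass

@dataclass
class PlantState:
    openness: float      # 0.0 = closed petals, 1.0 = fully bloomed
    visitors_today: int  # long-term count surfaced in the AR layer

def update_plant(minutes_seated: float, visitors_today: int) -> PlantState:
    """Map time spent at the work space to how far the petals open."""
    # Assume the petals bloom gradually over the first 30 minutes.
    openness = min(max(minutes_seated, 0.0) / 30.0, 1.0)
    return PlantState(openness=openness, visitors_today=visitors_today)

def ar_summary(state: PlantState) -> str:
    """What the handheld AR device might display for this plant."""
    return (f"Petals {state.openness:.0%} open | "
            f"{state.visitors_today} students studied here today")

state = update_plant(minutes_seated=15, visitors_today=42)
print(ar_summary(state))  # Petals 50% open | 42 students studied here today
```

The split mirrors the diagram: the sensor feed drives the plant's short-term physical state, while the AR layer reads the same state plus the long-term counts.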

I then started to prototype some physical forms based on our sketches and previous conversations. It was important for us to start making physical prototypes to measure the scale and feasibility of our plant object.


I started with folding some paper prototypes of different flower forms to play with form and scale. We liked the idea of a lotus flower/flower opening, but we also had to make sure the movement could be implemented into the object.


As we started to look at the interaction of the plant movement, I started with a foamcore prototype with a blooming/closing interaction.



Finally, in preparation for our next critique, we created a short presentation that summarized our concept and reviewed our progress so far. After our progress presentation, we received a lot of insightful feedback from our peers and professors.


Feedback

  • Why AR? Consider pepper’s ghost instead of phone AR (or, how do we pitch that AR is the best solution?)

Based on our feedback, we started to think about the necessity of AR and began exploring other options for the digital representation of data.


Tuesday, February 22nd

Prototyping + Form Development

After our last feedback session, we laid out three options for connecting the digital data to consider other options beyond AR. While we still liked the idea of AR connection to the physical space, we wanted to at least explore the possibility of a more elegant transition between physical to digital.

Then we looked into existing solutions/examples of projecting digital information through unique forms or displays.

After a lot of discussion and white-boarding sessions, we decided that due to the nature of the space, it would be challenging to set a projector up so that it integrates naturally with the work area. Additionally, because our target space is the second floor of TCS, without a wall behind the table that the students work at, it would be challenging to project a light/form into the open air.


This conversation pushed us back to the AR idea, but potentially in the form of a magnifying glass or physical device. Because we were focused on the concept of physicality, it was important for us to start narrowing in on the object form. We started looking at different architectural structures and plant forms to inspire our physical object.

Our team split up to sketch some different design/interaction ideas for our plant to explore unique and subtle movements for our structure.

Despite exploring a wide breadth of ideas, we realized that it would make the most sense to abstract the form and keep the movement as simple as possible. With the scope of the project and our new goal to incorporate a physical AR device, it was important for us to stick to a form that was feasible and abstract.


We decided to focus on the simple plant form that would involve a sliding interaction of plant leaves with a removable, handheld AR device. Because we wanted to create a working physical device, we wanted to keep the movement as simple as possible. This would also allow us to further explore branding the object and also provide a more seamless transition from the physical to digital aspect of our concept.


In the next few days, we worked on prototyping the form, interaction, and AR part of our final idea. We wanted to explore ways to integrate the AR device as part of the object.

We started out with foamcore prototyping to get a sense of the scale and form of our final design. It was important to consider the order of the layering and the positioning of the phone for our concept AR device.


While we started to model the petals with the initial intent of 3D printing them, we switched gears and decided that laser cutting would be best for prototyping and combining with Arduino. This way, we could iterate on the petal scale and form while adjusting the base to our final pieces.
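Iterating on petal scale and form between laser-cut passes lends itself to a quick script. As a hypothetical sketch (not the actual workflow we used), a parametric petal outline can be regenerated at any length and width and converted into an SVG path for the cutter:

```python
import math

def petal_outline(length=80.0, width=40.0, steps=60):
    """Return (x, y) points tracing a simple symmetric petal.

    The petal is modeled as two mirrored sine arcs meeting at the
    tip, so length and width can be rescaled between iterations.
    """
    points = []
    # Upper edge from base (0, 0) to tip (length, 0)
    for i in range(steps + 1):
        t = i / steps
        points.append((length * t, (width / 2) * math.sin(math.pi * t)))
    # Lower edge mirrored back from tip to base
    for i in range(steps, -1, -1):
        t = i / steps
        points.append((length * t, -(width / 2) * math.sin(math.pi * t)))
    return points

def to_svg_path(points):
    """Convert the outline into an SVG path string for the laser cutter."""
    head = f"M {points[0][0]:.2f},{points[0][1]:.2f} "
    body = " ".join(f"L {x:.2f},{y:.2f}" for x, y in points[1:])
    return head + body + " Z"
```

Changing the `length` and `width` arguments and re-exporting the path would give a new cut file in seconds, which is the kind of fast iteration that made laser cutting preferable to 3D printing here.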

Thursday, February 24th

Making, Refining, Branding

As we continued to refine the form to make it functional, we held several brainstorming sessions to work out the best way to create the sliding mechanism. In the end, we decided to slot the moving and stationary petals toward the back, while the front petal holding the AR device sits at the front. This way, we could nudge users to pick up the device to explore where their data is being collected.

As we continued to make progress on the physical form, I started AR prototyping in Adobe Aero to digitally connect the object to the sensors. We wanted to stick with the idea of particles/orbs for visual consistency with the light orb at the front of the plant, so I first started playing with spheres in a physical space.

View from the handheld AR device

However, I wanted to explore the idea of particle movement to convey the relationship between the data and the sensors in a more dynamic way. I created a model in Blender to export to Adobe Aero to see what the particles could look like when animated in space.

After exploring the movement and direction of the particles, I exported it to Adobe Aero to view what the particles could look like in an AR world.

Although the particles exported at low quality due to their number and the size of the file, I liked the movement toward the ceiling and felt that the reverse effect added to the sense of a relationship between the data and the Mites sensors. Afterwards, I tested the AR movement in TCS to see what it would look like in the space.
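The upward drift (and its reverse) boils down to a small position update per frame. As a rough sketch of that animation logic only, not our actual Blender particle setup, each orb rises steadily along the vertical axis while wandering slightly sideways:

```python
import random

def drift_frames(n_particles=20, n_frames=30, rise=0.05,
                 scatter=0.01, reverse=False, seed=1):
    """Simulate orb positions drifting upward (data rising toward
    the ceiling), or downward toward the sensor when reverse=True."""
    rng = random.Random(seed)
    # Particles start scattered on the y=0 plane around the object
    particles = [[rng.uniform(-0.5, 0.5), 0.0, rng.uniform(-0.5, 0.5)]
                 for _ in range(n_particles)]
    direction = -1.0 if reverse else 1.0
    frames = []
    for _ in range(n_frames):
        for p in particles:
            p[0] += rng.uniform(-scatter, scatter)  # slight sideways wander
            p[1] += direction * rise                # steady vertical drift
            p[2] += rng.uniform(-scatter, scatter)
        frames.append([tuple(p) for p in particles])
    return frames
```

Flipping `reverse` plays the same motion back toward the sensor, which is the "reverse effect" we liked for suggesting data flowing between the orbs and the Mites.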

Meanwhile, Eric focused on developing the technical side of our physical prototype while Rachel built out the visual branding of our plant object. Moving into the final stages of refining our concept, we started discussing its visual identity.

In class, we continued to make progress on each aspect of our concept to create a more cohesive experience. First, we started to outline different forms for our base so we could see which shape resonated with us the most. We decided to stick with a rounded base to maintain the organic structure of our object.

On my end, I reduced the number of particles in the 3D model and increased the scale so it would appear less like dust and more like small orbs. I then exported the new particle movement to judge the scale and to see what it looked like in the handheld device.

Afterwards, I exported it to AR to see what the particle movement looked like when originating from our plant prototype.

Tuesday, March 1st

Making, Refining, Branding

From here on out, we focused all our time and attention on crafting our final prototype and pulling all the pieces together. We started by laser cutting a cardboard prototype of all the components of our model to make sure the form and function would carry into our final product.

After a successful cardboard prototype, we started creating the rest of the model at high fidelity. We laser cut and attached the acrylic pieces, then sanded them down and added spackle to the edges to create a cohesive form. We sanded them again to make sure the leaves would be polished and solid.

Afterwards, we spray painted the leaves so they would be opaque and have consistency across the acrylic and 3D printed parts.

Once we had the leaves painted, the rest was centered around 3D printing the base to fit the parts and secure the mechanism.

To supplement the progress on our final model, we discussed the overall user journey and the emotions throughout the experience. Rachel mapped out the general flow and the thoughts behind the interactions in TCS. After receiving feedback in class, we realized it was important to frame the user journey around the user's emotions rather than around the object.

Finally, we moved to assemble all the parts together to create a final prototype of our object. This way, we could begin filming in the TCS Hall to create a concept video and begin outlining the materials needed for the final showcase.

Tuesday, March 15th

Finalizing, Refining, Presenting

In the last stretch of the project, we focused on filming our final demo to illustrate our concept and final prototype. We filmed at TCS Hall to set the context and help the audience become familiar with the space.


Final Photos

For the final presentation, we also updated the user journey map and laid out our process work to support our concept.

Final Video
