REDLab at CELA 2024: A Spotlight on Three Presentations of Innovative Research

REDLab proudly showcased its cutting-edge research at the CELA 2024 Conference with three presentations by its researchers. Each contribution examined a critical aspect of urban environments, from streetscape perception to urban vibrancy and visual quality assessment. Here's a closer look at the studies presented:

1. The Effect of Anthropocentric Stimuli on Perceptions of Streetscape Design
Authors: Aisha Iyengar, Jessica Fernandez, Stephen Ramos

In this study, the researchers explored how human-centered stimuli shape the perception of streetscapes. Using virtual reality (VR) technology, the team examined how varying levels of human activity within a mixed-use streetscape affect people's perception of the environment. Participants experienced a 360-degree video of the streetscape through a VR headset, with the number of people visible in the scene serving as the manipulated variable, and then completed a survey measuring their perceptual load and sense of place. Insights from the study could inform future streetscape design decisions, particularly in busy urban environments where human interaction plays a central role in shaping the user experience.

2. Creating a Comprehensive Framework for Place Vibrancy Evaluation: Integrating Human Perspectives and Spatial Dynamics
Authors: Shirin Rezaeimalek, Jessica Fernandez, Rosanna Rivero

This research addresses the complexity of urban vibrancy, a concept often studied through isolated lenses such as human activity or built-environment attributes. Recognizing the need for a more holistic approach, the researchers propose a framework that integrates user perceptions with spatial dynamics to evaluate urban vibrancy comprehensively.

Drawing from an extensive literature review, the presentation highlighted the importance of considering human perspectives, which have been largely overlooked in traditional vibrancy measurements. The proposed framework is expected to offer a more accurate quantification of urban vibrancy by capturing the multifaceted nature of place qualities. This approach could lead to more inclusive assessments of diverse urban spaces, ultimately contributing to the creation of livelier and more engaging public environments.

3. Measuring the Visual Quality of Street Space Using Machine Learning Modeling
Authors: Ruiqi Yang, Jessica Fernandez, Gengchen Mai, Angela Yao

This study leverages machine learning to assess the visual quality of street spaces in Singapore using Google Street View images. The researchers began with a survey in which 578 images were rated on characteristics such as chaos and vibrancy. These ratings were then used to train a support vector regression (SVR) model that quantifies the visual quality of street spaces from key visual features.
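For readers curious how such a model might be set up, here is a minimal sketch using scikit-learn. The feature matrix and ratings below are placeholders, since the study's actual feature-extraction pipeline and data are not described in this post; it only illustrates the general idea of fitting an SVR to predict human perceptual scores from image-derived features.

```python
# Minimal sketch (not the authors' code): fitting a support vector regression
# model to predict perceptual ratings of street-view images from precomputed
# visual features. All data below are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Placeholder feature matrix: one row per street-view image, one column per
# hypothetical visual feature (e.g., proportions of greenery, sky, buildings).
n_images, n_features = 578, 6
X = rng.random((n_images, n_features))

# Placeholder perceptual scores (e.g., mean survey rating of "vibrancy" per image).
y = rng.random(n_images)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# RBF-kernel SVR with feature standardization, a common default configuration.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0, epsilon=0.1))
model.fit(X_train, y_train)

# Compare predicted scores with held-out human ratings.
print("R^2 on held-out images:", r2_score(y_test, model.predict(X_test)))
```

In practice, the feature columns would be derived from the street-view images themselves (for example, via image segmentation or other computer-vision preprocessing), a step omitted here for brevity.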

The results demonstrated that the SVR model effectively aligns with human perceptions, suggesting that machine learning can be a valuable tool in landscape architecture for evaluating and guiding the aesthetic quality of urban streetscapes. This innovative approach not only enhances our understanding of visual quality in urban environments but also paves the way for more data-driven, objective assessments in the field.

By addressing both the human and technological aspects of urban environments, REDLab continues to contribute valuable insights that can help shape the future of our urban landscapes.
