Feature /atlas/ en ATLAS researchers converge at TEI’26 to showcase their work on tangible, embedded, and embodied interaction /atlas/atlas-researchers-converge-tei26-showcase-their-work-tangible-embedded-and-embodied <span>ATLAS researchers converge at TEI’26 to showcase their work on tangible, embedded, and embodied interaction</span> <span><span>Michael Kwolek</span></span> <span><time datetime="2026-03-09T09:06:26-06:00" title="Monday, March 9, 2026 - 09:06">Mon, 03/09/2026 - 09:06</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/2026-02/TEI%2026%20Conference.png?h=d4c4cd0a&amp;itok=Bbs5T1Mw" width="1200" height="800" alt="TEI 2026 Conference"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/855"> Feature News </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/396" hreflang="en">ACME</a> <a href="/atlas/taxonomy/term/1464" hreflang="en">brainmusic</a> <a href="/atlas/taxonomy/term/390" hreflang="en">do</a> <a href="/atlas/taxonomy/term/1463" hreflang="en">leslie</a> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> </div> <a href="/atlas/michael-kwolek">Michael Kwolek</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> <div><p dir="ltr"><span>Sound, 
vision, movement and touch—ATLAS researchers explore many different ways humans can interact with computers, collect and analyze data, and empower creative exploration.</span></p><p dir="ltr"><span>Nearly a dozen current and former ATLAS lab members will participate in&nbsp;</span><a href="https://tei.acm.org/2026/" rel="nofollow"><span>ACM TEI’26</span></a><span> in Chicago (March 8-11, 2026), the 20th annual conference presenting the latest results in tangible, embedded, and embodied interaction.</span></p><p dir="ltr"><span>This year’s conference theme is “Tide + Tied”. Organizers note, “By becoming a venue to bring multi-folded 'Tides' across diverse, interdisciplinary fields, the conference aims to bring researchers, designers, and artists with different backgrounds and interests together to be 'Tied,' weaving the future of the TEI community together.”</span></p><p dir="ltr"><span>ATLAS has been involved with the TEI conference since its early years, with Professor and ACME Lab director Ellen Do and ATLAS director Mark Gross both actively involved behind the scenes.&nbsp;</span></p><p dir="ltr"><span>Do, who is a co-author on three papers and three works-in-progress accepted at TEI ’26, explains, “Each one of the projects is a documentation of how researchers think about ideas and how to implement them and get them to fruition.”&nbsp;</span></p><p dir="ltr"><span>She elaborates, “The conference is called Tangible, Embedded and Embodied Interaction, so a lot of work we're doing is beyond the screen.
Things that we touch and put together.”</span></p><h3><span>Papers</span></h3><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/223849" rel="nofollow"><span><strong>Sound of Kigumi: A Playful VR Joinery Adjustment with Hammering Sound Feedback</strong></span></a></h4><p dir="ltr"><span>Kosei Ueda,&nbsp;</span><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><span><strong>Ellen Yi-Luen Do</strong></span></a><span>, Hironori Yoshida&nbsp;</span></p><p dir="ltr"><span><strong>Abstract</strong>: Traditional carpentry faces a critical shortage of skilled workers due to limited opportunities for potential apprentices to access onsite woodworking experience. Through expert interviews, we learned the importance of hammering sound to judge the precision in Kigumi assembly, as master carpenters rely on differences between “soft sound” and “sharp sound” without relying on visuals. This paper presents Sound of Kigumi (SoK), a playful VR system for inexperienced users to casually experience sound sensory skills through the loop of hammering and chiseling. In SoK, users listen to hammering sound in relation to tightness, assess the precision of their work, and return to chiseling for further adjustments. Furthermore, SoK implements pseudo-haptic feedback by visually modifying hammering resistance based on chiseling progress.
Expert evaluation indicated SoK replicates the hammering process and serves as an effective introductory tool, and user feedback confirmed SoK provides an immersive woodworking experience and effective Kigumi learning.</span></p></div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2026-03/Sound%20of%20Kigumi.png?itok=hzG0hFYY" width="1500" height="903" alt="Sound of Kigumi processing technique"> </div> <span class="media-image-caption"> <p><em>The first prototype: The user observes and hammers two types of Kigumi - one correctly processed without visible gaps when hammered and one incorrectly processed that reveals gaps upon hammering.</em></p> </span> </div></div><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/223840" rel="nofollow"><span><strong>Why (Not) ReacTIVision: Emerging Challenges and Opportunities for Building Tangible User Interfaces with Computer Vision Toolkits</strong></span></a></h4><p dir="ltr"><a href="/atlas/krithik-ranjan" rel="nofollow"><span><strong>Krithik Ranjan</strong></span></a><span>, S. Sandra Bae, Peter Gyory,&nbsp;</span><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><span><strong>Ellen Yi-Luen Do</strong></span></a><span>, Clement Zheng, Rong-Hao Liang</span></p><p dir="ltr"><span><strong>Abstract</strong>: Outdated Computer Vision (CV) toolkits for Tangible User Interfaces (TUI) have led to fragmented practices, diminished reproducibility, and reduced community support. This paper examines the past, present, and future trajectory of CV-TUI toolkits. 
First, our scoping review of ACM literature reveals a divergence between applications using the limited interactions of established toolkits like ReacTIVision and the fragmented, bespoke systems built for complex interactions, highlighting the need for advanced toolkits that enable accessible making. Second, we present proof-of-concept applications using the contemporary ArUco fiducial marker library. We demonstrate how accessible hardware, like a top-down camera and a flat-panel display, can support a comprehensive design space of tangible interactions beyond 2D manipulation, including 3D spatial interaction, multi-device interaction, and actuated tangibles within canonical applications. Finally, reflecting on our findings, we offer six suggestions for building next-generation CV-TUI toolkits. This study provides the TUI community with an updated perspective to inform future research.</span></p></div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2026-03/ReacTIVision_0.png?itok=eSliRMj7" width="1500" height="431" alt="Sensing touch input on tokens with a capacitive touchscreen"> </div> <span class="media-image-caption"> <p><em>Sensing touch input on tokens with a capacitive touchscreen: a) Each ArUco-marked knob is augmented with a vinyl-cut copper sheet pattern. 
b-c) The knob transfers finger-touch inputs to the touchscreen when users interact with the token.</em></p> </span> </div></div><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/223837" rel="nofollow"><span><strong>HyperDance: Real-Time Vibrotactile Stimulation Feedback of Inter-Brain Connectivity in Partner Dance</strong></span></a></h4><p dir="ltr"><a href="/atlas/thiago-roque" rel="nofollow"><span><strong>Thiago Rossi Roque</strong></span></a><span>, Ruojia Sun,&nbsp;</span><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><span><strong>Ellen Yi-Luen Do</strong></span></a><span>,&nbsp;</span><a href="/atlas/grace-leslie" rel="nofollow"><span><strong>Grace Leslie</strong></span></a><span>&nbsp;</span></p><p dir="ltr"><span><strong>Abstract</strong>: Building on the growing interest in technology-supported dance practice, neural imaging offers novel opportunities to reveal dancers’ internal states and expand the possibilities for augmented, embodied interaction. Despite advances in social neuroscience, the exploration of dance through brain imaging remains limited by technical challenges. To overcome these barriers, we developed and validated a real-time vibrotactile biofeedback system based on inter-brain coupling (IBC) measures from tango dancers using a mobile, synchronous multi-brain EEG system. We first conducted an empirical study recording synchronized EEG and motion data to test whether behavioral synchronization enhances inter-brain coupling. Insights from this study informed the design of our tangible neurofeedback system, which experienced dancers evaluated. Our findings support the Synchronicity Hypothesis of Dance and demonstrate how embodied technologies can enhance collective dance practice. 
This work introduces a novel methodological and interaction paradigm, bridging neural measurement with wearable feedback for socially situated embodied experiences.</span></p></div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2026-03/HyperDance.png?itok=TyomgJTX" width="1500" height="880" alt="Two dancers wearing EEG caps"> </div> <span class="media-image-caption"> <p><em>HyperDance enables real-time measurement and tactile feedback of inter-brain coupling during natural partner dance practice.</em></p> </span> </div></div><hr><h3>Art and Performance</h3><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/224275" rel="nofollow"><span>Bioactuated Tapestry: Converging Textile Craft and Moisture-Responsive Biomaterials</span></a></h4><p><a href="/atlas/eldy-lazaro" rel="nofollow"><span><strong>Eldy S. Lazaro Vasquez</strong></span></a><span>;&nbsp;</span><a href="/atlas/viola-arduini" rel="nofollow"><span><strong>Viola Arduini</strong></span></a><span>;&nbsp;</span><a href="/atlas/etta-sandry" rel="nofollow"><span><strong>Etta W Sandry</strong></span></a><span>;&nbsp;</span><a href="/atlas/katerina-houser" rel="nofollow"><span><strong>Katerina Houser</strong></span></a><span>;&nbsp;</span><a href="/atlas/srujana-golla" rel="nofollow"><span><strong>Srujana Golla</strong></span></a><span>;&nbsp;</span><a href="/atlas/mirela-alistar" rel="nofollow"><span><strong>Mirela Alistar</strong></span></a></p><p><span><strong>Abstract</strong>: Bioactuated Tapestry is an installation that explores how biomaterials and textile craft unfold multiple temporalities of interaction.
Structured in three zones, the installation moves from milk-based bioplastic samples that change shape quickly when misted, to a Sample Book that documents iterations of bioplastic integration into weaving, to a woven tapestry that changes shape slowly in response to humidity in the surrounding space. Together, these zones demonstrate how interaction can emerge from material behavior shaped through biomaterial formulation and, when woven, through structure. The work foregrounds biomaterial agency, weaving, and situated sustainability grounded in sourcing, fabrication, and practices of care. Through this convergence of biodesign and textile craft, Bioactuated Tapestry aligns with the TEI theme of Resurgence and Convergence, highlighting how material-led practices reconnect material experimentation, environmental attunement, and embodied ways of knowing.</span></p><p><a class="ucb-link-button ucb-link-button-blue ucb-link-button-default ucb-link-button-regular" href="https://www.eldylazaro.com/?portfolio=bioactuated-tapestry" rel="nofollow"><span class="ucb-link-button-contents">Learn More</span></a></p></div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2026-03/Bioactuated%20Tapestry%202.jpg?itok=CD47wz59" width="1500" height="1125" alt="Detail of bioactuated textile"> </div> <span class="media-image-caption"> <p><em>Detail of Bioactuated Tapestry, showing colored casein-based bioplastic strips woven through black cotton yarns. 
Moisture causes the bioplastic to change shape, and the weave directs that change into curling.</em></p> </span> </div></div><hr><h3>Pictorials</h3><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/224206" rel="nofollow"><span>Designing for the Leaky Body: Exploring Biomaterial Absorption as Body-Material Interaction</span></a></h4><p dir="ltr"><a href="/atlas/viola-arduini" rel="nofollow"><span><strong>Viola Arduini</strong></span></a><span>;&nbsp;</span><a href="/atlas/eldy-lazaro" rel="nofollow"><span><strong>Eldy S. Lazaro Vasquez</strong></span></a><span>;&nbsp;</span><a href="/atlas/srujana-golla" rel="nofollow"><span><strong>Srujana Golla</strong></span></a><span>;&nbsp;</span><a href="/atlas/mirela-alistar" rel="nofollow"><span><strong>Mirela Alistar</strong></span></a></p><p dir="ltr"><span><strong>Abstract</strong>: Leaking bodies are often concealed or disregarded in both society and design. Likewise, bodily fluids are rarely leveraged as triggers for material interaction in HCI. In this pictorial, we investigate how fluid-responsive biomaterials can enable porous, expressive, and cyclical interactions co-shaped by the body. We focus on a milk-derived bioplastic with reversible shape-changing properties, examining fluid absorption as a meaningful design affordance. Our material-led approach contributes both formulation and fabrication methods of casein bioplastic, while autoethnographic inquiry with a lactating body informed the development of Leaky Body Maps and speculative garments that position leakage as a generative site of body-material interaction.
This work contributes to the discourse of feminist and posthuman HCI by centering bodily permeability, material responsiveness, and the potential of designing with – rather than concealing – leaky bodies.</span></p></div><div class="col ucb-column"> <div class="imageMediaStyle small_500px_25_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/small_500px_25_display_size_/public/2026-03/Leaky%20Body.jpg?itok=VSsJmDqY" width="375" height="610" alt="Leaky body garment prototype"> </div> <span class="media-image-caption"> <p><em>Garment prototype showing where casein-based bioplastic was placed based on a body leak map of possible milk leakage.</em></p> </span> </div></div><hr><h3><span>Works In Progress</span></h3><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/224242" rel="nofollow"><span><strong>ArUcoTUI: Software Toolkit for Prototyping Tangible Interactions on Portable Flat-Panel Displays with OpenCV</strong></span></a></h4><p dir="ltr"><span>Rong-Hao Liang, Steven Houben,&nbsp;</span><a href="/atlas/krithik-ranjan" rel="nofollow"><span><strong>Krithik Ranjan</strong></span></a><span>, S. Sandra Bae, Peter Gyory,&nbsp;</span><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><span><strong>Ellen Yi-Luen Do</strong></span></a><span>, Clement Zheng</span></p><p dir="ltr"><span><strong>Abstract</strong>: Tangible User Interfaces (TUIs) that integrate digital information with physical interaction require specialized hardware and complex calibration, limiting their adoption in portable or mobile display systems. This paper introduces ArUcoTUI, a computer vision (CV) toolkit for prototyping tangible interactions on portable screens, leveraging standard cameras and the OpenCV library. ArUcoTUI uses ArUco fiducial markers to detect physical inputs. 
The software toolkit offers streamlined calibration, a signal processing pipeline, and a client application that translates tangible input into structured events for use in HCI applications. Using a conventional camera in a top-down setting with a flat-panel display, we demonstrate how this toolkit supports the development of interactive surface TUIs with advanced features, including 3D spatial interaction, multi-device interaction, and actuated tangibles within applications. We describe the software implementation, which utilizes accessible hardware to support the development of these tangible interactions. We provide the results of a preliminary evaluation with users, including design implications and suggestions for future research and development.</span></p></div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2026-03/ArUCoTUI.png?itok=jTHxkdnf" width="1500" height="287" alt="ArUcoTUI is a software toolkit for rapid TUI prototyping on portable screens"> </div> <span class="media-image-caption"> <p>ArUcoTUI is a software toolkit for rapid TUI prototyping on portable screens. It uses standard cameras, OpenCV, and ArUco markers for real-time object tracking. 
We demonstrate its applicability using an overhead camera for a) multi-token music control, b) above-screen gesture detection, c) multi-display board games, and d) actuated data visualization using robots.</p> </span> </div></div><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/224189" rel="nofollow"><span><strong>Rig-a-Doodle: Tangible Kit for Dynamic Hand-drawn Character Animation</strong></span></a></h4><p dir="ltr"><a href="/atlas/krithik-ranjan" rel="nofollow"><span><strong>Krithik Ranjan</strong></span></a><span>, Khushbu Kshirsagar, Harrison Jesse Smith,&nbsp;</span><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><span><strong>Ellen Yi-Luen Do</strong></span></a></p><p dir="ltr"><span><strong>Abstract</strong>: Character animation remains challenging for novices and children despite advances in digital tools. While recent tangible interfaces have lowered barriers by enabling creators to animate their drawings on paper, they are limited to preset animation sequences and support for only human-like characters. We present Rig-a-Doodle, a tangible kit and web application for fully open-ended character rigging animation, where creators can draw any character and construct a custom physical rig using everyday materials to animate it. This work-in-progress contributes a system of tangible interaction to animate hand-drawn characters by direct physical manipulation of custom rigs in real-time.
We share findings from a preliminary workshop with adults to explore the kinds of expressive animation the kit enables, discover issues with interaction, and source ideas for future directions.</span></p></div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2026-03/Rig-a-Doodle.png?itok=uGx3UhhN" width="1500" height="1212" alt="Rig-a-Doodle character template"> </div> <span class="media-image-caption"> <p>(Top-left) Rig-a-Doodle character template to draw the character and cut out CV markers for the rig. (Top-right, bottom-left, bottom-right) The three steps of Capture, Assign, and Play illustrated with screenshots from the Rig-a-Doodle application.</p> </span> </div></div><div class="row ucb-column-container"><div class="col ucb-column"><h4><a href="https://programs.sigchi.org/tei/2026/program/content/224167" rel="nofollow"><span><strong>Lighting the Reef: Modular Paper Circuits as Ecological Metaphor</strong></span></a></h4><p dir="ltr"><a href="/atlas/ruhan-yang" rel="nofollow"><span><strong>Ruhan Yang</strong></span></a><span>,&nbsp;</span><a href="/atlas/yuchen-zhang" rel="nofollow"><span><strong>Yuchen Zhang</strong></span></a><span>,&nbsp;</span><a href="/atlas/ellen-yi-luen-do" rel="nofollow"><span><strong>Ellen Yi-Luen Do</strong></span></a></p><p dir="ltr"><span><strong>Abstract</strong>: We present Lighting the Reef, an interactive installation that uses modular 3D paper circuits to explore ecological fragility. Participants build coral structures from foldable paper blocks with copper tape and low-voltage components. When connections align, coral modules glow, metaphorically expressing the energy exchange between coral and zooxanthellae, the symbiotic algae crucial to coral metabolism. Pollution modules add resistance that dims light or interrupts the current entirely, mirroring environmental disruption. 
We position Lighting the Reef as a Research through Design case that articulates fragility as an interaction aesthetic and ecological metaphor. We reflect on how modular circuitry, material constraints, and embodied play make precarity tangible. We also report workshops with 15 participants that discussed themes of care, collapse, and interdependence. We contribute insights into designing for fragility with modular circuits, ecological storytelling through tangible interaction, and accessible and reproducible designs for participatory sustainability education.</span></p></div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2026-03/Lighting%20the%20Reef.png?itok=kmLGkp1V" width="1500" height="1073" alt="Lighting the Reef installation built from 3D paper circuit modules"> </div> <span class="media-image-caption"> <p><em>Lighting the Reef is a tangible installation built from 3D paper circuit modules, whose illumination depends on alignment and balance. 
As participants assemble and adjust the blocks, the lights brighten, dim, or turn off, reflecting the changing conditions of the coral system.</em></p> </span> </div></div><p dir="ltr">&nbsp;</p></div> </div> </div> </div> </div> <div>Members of several ATLAS labs show off the latest research on human-computer interactivity.</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Mon, 09 Mar 2026 15:06:26 +0000 Michael Kwolek 5177 at /atlas Using AI ethically: 6 tips for bringing AI tools into learning and work /atlas/using-ai-ethically-6-tips-incorporating-chatgpt-and-other-tools-how-we-learn-and-work <span>Using AI ethically: 6 tips for bringing AI tools into learning and work</span> <span><span>Michael Kwolek</span></span> <span><time datetime="2026-03-03T10:14:21-07:00" title="Tuesday, March 3, 2026 - 10:14">Tue, 03/03/2026 - 10:14</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/2026-02/AI%20Ethics%20Adobe%20Stock.jpeg?h=82f92a78&amp;itok=ZG4IBr1R" width="1200" height="800" alt="woman sits at laptop studying"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/855"> Feature News </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> </div> <div 
class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> <div><p dir="ltr"><span>How do you AI? Environmental, privacy, political, and intellectual property issues aside (and those are major issues), there are many ethics considerations involved in how we approach our own day-to-day use of AI tools.</span></p> <div class="align-right image_style-small_500px_25_display_size_"> <div class="imageMediaStyle small_500px_25_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/small_500px_25_display_size_/public/2026-02/AI%20Ethics%203.jpeg?itok=ZHQG-qR_" width="375" height="248" alt="Nikolaus Klassen teaches AI ethics"> </div> </div> <p dir="ltr"><span>We interviewed </span><a href="/atlas/nikolaus-klassen" rel="nofollow"><span>Nikolaus Klassen</span></a><span>, business analyst at Google and ATLAS lecturer, on the topic of AI ethics. You can read highlights from the conversation in our article, </span><a href="/atlas/exploring-ethics-ai-can-we-use-chatgpt-and-other-tools-consciously" rel="nofollow"><span>Exploring the ethics of AI: Can we use tools like ChatGPT consciously?</span></a><span>&nbsp;</span></p><p dir="ltr"><span>During our interview, Klassen also proposed tips we can use when considering how to incorporate AI tools into our work. Take a look:</span></p><h3><span>Consider hidden taxonomies</span></h3><p dir="ltr"><span>All the processed information we use in AI tools is organized through taxonomies—systems for naming, labeling and cataloging datapoints. 
The question that we should ask ourselves when we get an AI output is: What are the hidden taxonomies and assumptions this is built on?&nbsp;</span></p><p dir="ltr"><span>For instance,&nbsp;</span><a href="https://medium.com/@todasco/what-ai-thinks-ceos-look-like-bd8831013370" rel="nofollow"><span>a researcher found that</span></a><span> AI tools would respond to the prompt “picture of a CEO” with outputs of a (usually white) man in a suit. Nobody explicitly trained the models to do that, nor did users request it—this is an example of a hidden taxonomy.&nbsp;</span></p><p dir="ltr"><span>That may be an obvious example of bias baked into the system over time, but there are as-yet undiscovered taxonomies in the data AI tools draw on.&nbsp;</span></p><h3><span>Note the law of the instrument&nbsp;</span></h3><p dir="ltr"><span>Abraham Maslow said, “If the only tool you have is a hammer, you tend to see every problem as a nail.” Ask yourself: Am I distorting reality to fit my tool?&nbsp;</span></p><p dir="ltr"><span>The example Klassen uses in class is predictive policing, which distorts reality by applying an algorithm fed on historical data that may be incomplete, biased, and a poor analog for the present reality. This is a use case that requires us to distort reality to make the tool work on a problem it does not fit.</span></p><h3><span>Do a reality check</span></h3><p dir="ltr"><span>Consider what would happen if you were to act on the advice an AI tool outputs. Does this recommendation actually fit this situation? Consider why the AI tool is offering a particular choice in a specific way.
This will become increasingly important as AI companies incorporate advertising into their platforms.&nbsp;</span></p><p dir="ltr"><span>Is the tool framing the choice appropriately or could there be a more ethically sound way to frame this choice?</span></p><h3><span>Hone your judgment skills</span></h3><p dir="ltr"><span>Before AI tools became ubiquitous, students and junior workers typically turned what they learned into artifacts—they would write a software function, develop a mathematical proof, draft an essay or sketch out a design. Such artifacts were the output of the mental work they did.&nbsp;</span></p><p dir="ltr"><span>Now that AI can easily create artifacts, such outputs can no longer be considered the endpoint of mental work. When artifacts are cheap, judgment becomes more valuable.&nbsp;</span></p><p dir="ltr"><span>If we do not have to build research reports, analyses, recommendations or even creative designs ourselves—as junior workers often did in many fields—we risk losing an entire infrastructure designed to train the next generation of leaders to have refined judgment and discernment skills.</span></p><p dir="ltr"><span>We must be diligent in learning to judge artifacts made by AI and determining how to iterate on and improve them.</span></p><div class="ucb-box ucb-box-title-left ucb-box-alignment-right ucb-box-style-fill ucb-box-theme-lightgray"><div class="ucb-box-inner"><div class="ucb-box-title"><span>Key ethics concepts</span></div><div class="ucb-box-content"><ul><li dir="ltr"><span><strong>Choice architecture</strong> - A deliberate design of a tool or environment that influences how people make decisions without directly restricting choice.</span></li><li dir="ltr"><span><strong>Deontology</strong> - The theory that there are absolute moral obligations that must be followed regardless of consequences, exceptions, or potential benefits.</span></li><li dir="ltr"><span><strong>Law of the instrument</strong> - A cognitive bias toward 
over-reliance on a familiar tool for solving problems, regardless of suitability.</span></li><li dir="ltr"><span><strong>Moral licensing</strong> - A phenomenon in which people justify an immoral action after having previously done something good.</span></li><li dir="ltr"><span><strong>Utilitarianism</strong> - The theory that the most moral action is the one that maximizes good and minimizes suffering for the greatest number of people.</span></li></ul></div></div></div><h3><span>Watch for “workslop”</span></h3><p dir="ltr"><span>Harvard Business Review recently published an article contending that&nbsp;</span><a href="https://hbr.org/2025/09/ai-generated-workslop-is-destroying-productivity" rel="nofollow"><span>AI-generated “workslop” is destroying productivity</span></a><span>. Consider overly wordy reports with hidden errors, customer service chatbots that lead users to dead ends, drab creative copy, business recommendations that presenters cannot defend. AI can make people slower when they have to wade through slop—and they often return the favor with AI-generated responses of their own.&nbsp;</span></p><p dir="ltr"><span>AI tools can feel like fast food, giving us something quick and easy that may not be very nutritious. Yet we are given no "nutrition facts"—we do not know where the output is incorrect or shows bias or fails to give the full story.</span></p><p dir="ltr"><span>Like diet and exercise, it takes conscious effort to maintain a healthy relationship with AI tools. 
If you want to stay mentally engaged, you have to do the equivalent of going to the gym and working out.&nbsp;</span></p><h3><span>Beware the dopamine peak</span></h3><p dir="ltr"><span>When making something yourself from scratch, your work builds up to the moment of completion—this creates a dopamine peak, a temporary surge in the "feel-good" hormone.&nbsp;</span></p><p dir="ltr"><span>But when AI brings you to that peak immediately and with little effort, it plunges just as quickly afterward. You may lose motivation and not necessarily intellectualize what you just completed.&nbsp;</span></p><p dir="ltr"><span>We would do well to learn to use AI tools as a means of continued development toward mastery of craft rather than simply as time savers.</span></p></div> </div> </div> </div> </div> <div>AI tools are everywhere. We offer several ethical considerations for how and when to use them.</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Tue, 03 Mar 2026 17:14:21 +0000 Michael Kwolek 5176 at /atlas Exploring the ethics of AI: Can we use tools like ChatGPT consciously?
/atlas/exploring-ethics-ai-can-we-use-chatgpt-and-other-tools-consciously <span>Exploring the ethics of AI: Can we use tools like ChatGPT consciously?</span> <span><span>Michael Kwolek</span></span> <span><time datetime="2026-02-24T09:57:39-07:00" title="Tuesday, February 24, 2026 - 09:57">Tue, 02/24/2026 - 09:57</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/2026-02/AI%20Ethics%201.JPG?h=e70b5e05&amp;itok=hbjRIK1z" width="1200" height="800" alt="Nikolaus Klassen at the front of a classroom with a slide that says &quot;Core Problem: How can we trust AI?&quot;"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/855"> Feature News </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/364" hreflang="en">CTD</a> <a href="/atlas/taxonomy/term/1181" hreflang="en">bsctd</a> <a href="/atlas/taxonomy/term/360" hreflang="en">ctd</a> <a href="/atlas/taxonomy/term/1269" hreflang="en">msctd</a> <a href="/atlas/taxonomy/term/771" hreflang="en">phd</a> </div> <a href="/atlas/michael-kwolek">Michael Kwolek</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> <div><p dir="ltr"><span>As adoption of AI tools speeds up on campuses worldwide, students, faculty, and staff may be tempted to simply 
adopt-and-go. But it pays to consider the ethical implications of how we approach such technologies.</span></p> <div class="align-right image_style-original_image_size"> <div class="imageMediaStyle original_image_size"> <img loading="lazy" src="/atlas/sites/default/files/styles/original_image_size/public/2025-02/nikolaus_klassen.jpg?itok=15udUfPb" width="200" height="200" alt="Profile of a white man with short brown hair and a beard. He is wearing glasses and a blue dress shirt."> </div> </div> <p dir="ltr"><a href="/atlas/nikolaus-klassen" rel="nofollow"><span>Nikolaus Klassen</span></a><span>, business analyst at Google, teaches Applied AI Ethics to undergraduate and graduate students at the ATLAS Institute. With a PhD in classics and a background in data processing and reporting, Klassen has built a career that zigzags between the humanities and the tech world.</span></p><p dir="ltr"><span>We discussed the ethical implications of AI tools and how students are thinking about them. This conversation was lightly edited for space and clarity.</span></p><p dir="ltr"><span><strong>If you were to distill the concept of AI ethics to a few major themes in our current moment, what would they be?</strong></span></p><p dir="ltr"><span>I think AI ethics specifically—and tech ethics more generally speaking—is often presented as a trade-off: You can use this tool for free, but we'll invade your privacy. For me that's the core of the problem, because very often it's not easy to break out of this trade-off.&nbsp;</span></p><p dir="ltr"><span>Do you look at utilitarianism, at the consequences, or do you set up unbreakable rules? 
Again, it’s almost like a trade-off.&nbsp;</span></p><p dir="ltr"><span>So my core approach to AI ethics and tech ethics in general is: How can we ask better questions and find better frameworks that will bring us beyond this simple trade-off between the good and the bad?&nbsp;</span></p><p dir="ltr"><span>Is there a way to offer people better choices and to offer choices in a way that [helps us] make good decisions? Instead of letting our privacy be invaded all the time and giving away our data because the defaults are set up in a certain way, how can we dig deeper and find more root causes of bias in the data?&nbsp;</span></p><p dir="ltr"><span>For me, ethics is more about how can I use these frameworks to expose structural problems and maybe make them better? Alleviate the problems or solve them where possible, rather than just accept that they're part of this bad trade-off.</span></p><div class="ucb-box ucb-box-title-left ucb-box-alignment-right ucb-box-style-fill ucb-box-theme-lightgray"><div class="ucb-box-inner"><div class="ucb-box-title">Key ethics concepts</div><div class="ucb-box-content"><ul><li dir="ltr"><span><strong>Utilitarianism</strong> - The theory that the most moral action is the one that maximizes good and minimizes suffering for the greatest number of people.</span></li><li dir="ltr"><span><strong>Deontology</strong> - The theory that there are absolute moral obligations that must be followed regardless of consequences, exceptions, or potential benefits.</span></li><li dir="ltr"><span><strong>Moral licensing</strong> - A phenomenon in which people justify an immoral action after having previously done something good.</span></li><li dir="ltr"><span><strong>Law of the instrument</strong> - A cognitive bias toward over-reliance on a familiar tool for solving problems, regardless of suitability.</span></li><li dir="ltr"><span><strong>Choice architecture</strong> - A deliberate design of a tool or environment that influences how people make 
decisions without directly restricting choice.</span></li></ul></div></div></div><p dir="ltr"><span><strong>Why do you think your AI Ethics class is so popular among ATLAS and non-ATLAS students?</strong></span></p><p dir="ltr"><span>I think students are pretty concerned about AI. Is it going to take away all the jobs? It seems to for entry-level jobs, so there is a direct impact. And I see students honestly grapple with how they should use AI in their own studies.</span></p><p dir="ltr"><span>People frame it as: Is AI my crutch or is it a good tool that I'm using?&nbsp;</span></p><p dir="ltr"><span>It's not like this is an abstract academic phenomenon. If you're going through your surroundings with open eyes, you can see bad impacts of unethical AI usage, so I think this is very concrete and applicable for students.&nbsp;</span></p><p dir="ltr"><span><strong>What do you hope students take away from spending a semester considering the ethical implications of AI technology?</strong></span></p><p dir="ltr"><span>For me it's really all about the questions—I want students to have a toolbox of questions they can ask and to teach them when they see a phenomenon not to just take it at face value. Be it a technology, an app, a use case, whatever their friends are using. To say, “Hold on a minute, let me ask some questions here,” and give them good questions to ask. To say, “How can I dive deeper into a problem?” and understand the root cause or the assumptions that are hidden here and sharpen these analytical tools to cut through the noise.&nbsp;</span></p><p dir="ltr"><span><strong>How do you think about AI in general? A tool? A platform? A way of life?</strong></span></p><p dir="ltr"><span>As humans, we experience these gateway transitions where we change something and then open up a new world. Agriculture enabled cities and civilizations and division of labor with all the bad and all the good [associated with that]. 
We suddenly could finance full-time poets and musicians and spend more resources on meaning making and culture.</span></p><p dir="ltr"><span>Then you have the mechanical engine and the revolution that came with it. We have a lot more mobility today. We don't have to work so hard. Our life expectancy has basically doubled since then. It has enabled all kinds of different ways of living.</span></p> <div class="align-right image_style-small_500px_25_display_size_"> <div class="imageMediaStyle small_500px_25_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/small_500px_25_display_size_/public/2026-02/AI%20Ethics%204.jpeg?itok=7sa7PxIX" width="375" height="281" alt="Nikolaus Klassen in front of a screen that says Purpose (How), Goal (What), Means (How)"> </div> </div> <p dir="ltr"><span>I think AI is probably going to be the same. The amount of information that we have in the world today is far beyond what humans can process. Because there's so much information around, it's hard to cut through it. For better or worse, we need technology to help us process it. We cannot do so on our own anymore. I think this will be the next gateway.&nbsp;</span></p><p dir="ltr"><span>Most likely we will go through a valley like we did after the agricultural revolution and the mechanical revolution with unemployment rising or people being more and more hooked on digital technology. I feel like this is happening whether we want that or not.</span></p><p dir="ltr"><span><strong>The speed of change feels unprecedented. How does ethics apply to a phenomenon that is evolving so quickly?</strong></span></p><p dir="ltr"><span>I don't think it's ever going to be too late to make AI more ethical. If you think about the industrial revolution, the life of workers got so much worse when they started to work in the factories than it was when they were working in the fields. It took 50 or 100 years or so to rectify that. 
And within that comparatively short time span, the life of workers was better than the life of farmers. And we probably have stronger social ethics today than we had in the 18th century, so I don't think it's impossible for AI to do that. I would expect it to happen.&nbsp;</span></p><p dir="ltr"><br><em><span><strong>Want to learn more? Check out our follow-up story </strong></span></em><a href="/atlas/using-ai-ethically-6-tips-incorporating-chatgpt-and-other-tools-how-we-learn-and-work" rel="nofollow"><em><span><strong>Using AI ethically: 6 tips for bringing AI tools into learning and work.</strong></span></em></a></p></div> </div> </div> </div> </div> <div>As tech advancements speed up, consider how best to incorporate AI tools at school and work.</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Tue, 24 Feb 2026 16:57:39 +0000 Michael Kwolek 5173 at /atlas Inaugural Sustainability Research Initiative Research Fellows unveiled /atlas/2026/02/23/inaugural-sustainability-research-initiative-research-fellows-unveiled <span>Inaugural Sustainability Research Initiative Research Fellows unveiled</span> <span><span>Michael Kwolek</span></span> <span><time datetime="2026-02-23T13:20:47-07:00" title="Monday, February 23, 2026 - 13:20">Mon, 02/23/2026 - 13:20</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/2026-02/Sustainability%20Research%20Initiative.png?h=5d0f0d5c&amp;itok=MaNEaISR" width="1200" height="800" alt="Image of globe overlayed on sustainability icons and an image of rolling hills"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid 
fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/855"> Feature News </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/729" hreflang="en">alistar</a> <a href="/atlas/taxonomy/term/731" hreflang="en">living matter</a> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> </div> </div> </div> </div> <div>Assistant professor Mirela Alistar is a member of the first cohort of SRI Research Fellows selected by CU Boulder's Research &amp; Innovation Office. </div> <script> window.location.href = `/researchinnovation/2026/02/23/inaugural-sustainability-research-initiative-research-fellows-unveiled`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Mon, 23 Feb 2026 20:20:47 +0000 Michael Kwolek 5175 at /atlas Inside Sh!tty Hacks: An Anti-Hackathon at CU Boulder /atlas/inside-shtty-hacks-anti-hackathon-cu-boulder <span>Inside Sh!tty Hacks: An Anti-Hackathon at CU Boulder</span> <span><span>Michael Kwolek</span></span> <span><time datetime="2026-02-05T12:54:22-07:00" title="Thursday, February 5, 2026 - 12:54">Thu, 02/05/2026 - 12:54</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/2026-02/Shitty%20Hacks.jpg?h=7afb1587&amp;itok=5Wt3R-44" width="1200" height="800" alt="Shitty Hacks awards"> </div> </div> 
<div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/855"> Feature News </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/372" hreflang="en">BTU</a> <a href="/atlas/taxonomy/term/1181" hreflang="en">bsctd</a> <a href="/atlas/taxonomy/term/360" hreflang="en">ctd</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> </div> </div> </div> </div> <div>Sometimes the process of making and creating loses its “fun” streak, so why not put on a 24-hour hackathon that awarded the weird, wild, and possibly destructive nature of making and engineering?</div> <script> window.location.href = `https://shawnhymel.com/3151/inside-shtty-hacks-an-anti-hackathon-at-cu-boulder/`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Thu, 05 Feb 2026 19:54:22 +0000 Michael Kwolek 5168 at /atlas 2026 RIO Faculty Fellows cohort spans departments and disciplines across campus /atlas/2026-rio-faculty-fellows-cohort-spans-departments-and-disciplines-across-campus <span>2026 RIO Faculty Fellows cohort spans departments and disciplines across campus</span> <span><span>Michael Kwolek</span></span> <span><time datetime="2025-12-08T15:57:57-07:00" title="Monday, 
December 8, 2025 - 15:57">Mon, 12/08/2025 - 15:57</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/2025-12/Rivera%20Utility%20Research.jpg?h=82f92a78&amp;itok=ihaFlGK-" width="1200" height="800" alt="Michael Rivera"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/855"> Feature News </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> <a href="/atlas/taxonomy/term/1511" hreflang="en">rivera</a> <a href="/atlas/taxonomy/term/1510" hreflang="en">utility</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> </div> </div> </div> </div> <div>Assistant professor Michael Rivera is one of 18 joining the 2026 RIO Faculty Fellow cohort.</div> <script> window.location.href = `/researchinnovation/2025/12/08/2026-rio-faculty-fellows-cohort-spans-departments-and-disciplines-across-campus`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Mon, 08 Dec 2025 22:57:57 +0000 Michael Kwolek 5157 at /atlas Balfour’s Memory Care patients treated to soothing Longmont Symphony experience 
/atlas/2025/12/05/balfours-memory-care-patients-treated-soothing-longmont-symphony-experience <span>Balfour’s Memory Care patients treated to soothing Longmont Symphony experience</span> <span><span>Michael Kwolek</span></span> <span><time datetime="2025-12-05T11:54:54-07:00" title="Friday, December 5, 2025 - 11:54">Fri, 12/05/2025 - 11:54</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/2025-12/Orchestra.png?h=0e753701&amp;itok=RPullYNb" width="1200" height="800" alt="Conductor leading orchestra"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/855"> Feature News </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/1464" hreflang="en">brainmusic</a> <a href="/atlas/taxonomy/term/1463" hreflang="en">leslie</a> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> </div> </div> </div> </div> <div>Grace Leslie noted that "for people living with Alzheimer’s or dementia, both anecdotal and experimental evidence point to the durability of music in the brain."</div> <script> window.location.href = `https://www.dailycamera.com/2025/12/04/longmont-symphony-balfour-partnership/`; </script> <h2> <div class="paragraph 
paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Fri, 05 Dec 2025 18:54:54 +0000 Michael Kwolek 5156 at /atlas Coding with creativity: How ATLAS students think beyond algorithms /atlas/coding-creativity-how-atlas-students-think-beyond-algorithms <span>Coding with creativity: How ATLAS students think beyond algorithms</span> <span><span>Michael Kwolek</span></span> <span><time datetime="2025-12-01T11:07:35-07:00" title="Monday, December 1, 2025 - 11:07">Mon, 12/01/2025 - 11:07</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/2025-12/Coding%20with%20creativity%201.JPG?h=82f92a78&amp;itok=P_R4nT1n" width="1200" height="800" alt="student works on maze program on a laptop"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/855"> Feature News </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/1181" hreflang="en">bsctd</a> <a href="/atlas/taxonomy/term/360" hreflang="en">ctd</a> </div> <a href="/atlas/caitlin-rockett">Caitlin Rockett</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> <div><p dir="ltr"><span>With his computer plugged into a projector at the 
front of the room, Hudson Blankner, a freshman in Gabe Johnson’s Computational Foundations I class, displayed his final project: a trio of classic games—rock, paper, scissors; tic tac toe; and table tennis.&nbsp;</span></p><p dir="ltr"><span>The assignment required students to build an interactive game using the programming skills they’d learned over the semester, and to experiment with different problem-solving strategies—including, if they wanted, “vibe” coding, the practice of prompting artificial intelligence models to generate code.</span></p> <div class="align-right image_style-medium_750px_50_display_size_"> <div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/2025-12/Coding%20with%20creativity%201.JPG?itok=ZYs77Wom" width="750" height="500" alt="student works on maze program on a laptop"> </div> </div> <p dir="ltr"><span>Blankner did try using AI, and he wasn’t subtle about his feelings.</span></p><p dir="ltr"><span>“I coded this all in one prompt, but I really hate vibe coding,” Blankner told the class. “AI is like a Division I gaslighter. It took 15 prompts to make the game look like this.”&nbsp;</span></p><p dir="ltr"><span>Johnson expects students to explore AI tools, as he sees more and more companies requiring software engineers to use AI to some extent to program more, and faster. But Johnson also expects students—like professional software engineers—to understand the logic behind their programs and learn to write their own code so they can judge whether AI output is right, wrong, good, bad or mediocre.&nbsp;</span></p><p dir="ltr"><span>“These students, for the most part, have not programmed before—they’re coming into this class fresh,” said Johnson, who teaches introductory computer programming courses for the Creative Technology and Design (CTD) curriculum at the ATLAS Institute. 
“Maybe some of them had taken ‘computer science classes’ in high school, but that’s often just building a web page.”</span></p><p dir="ltr"><span><strong>Engineering meets design</strong></span></p><p dir="ltr"><span>CTD degrees are granted through the College of Engineering, where coding and computational thinking are essential skills across disciplines. At ATLAS, CTD majors build that engineering foundation alongside deep design practice, giving them a holistic and strategic approach to problem solving. Rather than following trends or treating design as aesthetics alone, CTD students learn to analyze human needs and create solutions that are usable, meaningful and durable. That means students not only learn to code, they also build skills in web development, interaction design, physical prototyping, audio and video production, digital media, theory and project management.</span></p><p dir="ltr"><span>“I think CTD students can better explain their programming work,” Johnson said. “Yes, they have the technical knowledge, but they fit that knowledge into the broader context of society, of designing for humans. Communicating what you are doing is almost more important than the thing you are doing. CTD students are able to explain not just what they did, but why and how and what else they considered. Telling a story is much richer—much more human.”</span></p><p dir="ltr"><span><strong>Creative logic in action</strong></span></p> <div class="align-right image_style-medium_750px_50_display_size_"> <div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/2025-12/Coding%20with%20Creativity%203.JPG?itok=Yh_KKC73" width="750" height="500" alt="platforming game on a computer"> </div> </div> <p dir="ltr"><span>When presenting midterm projects, Johnson found first-time programmers in his Computational Foundations I class thinking outside the engineering box to solve problems. 
Laura Baker, a sophomore, wrestled with how to determine when a player-controlled bee had reached a flower at the center of a maze.</span></p><p dir="ltr"><span>“This was an annoyingly difficult problem that seemed so simple,” Baker said. “I took an artist’s approach using a simple Boolean statement and an array of RGB color codes: If the bee was touching the appropriate color, then it stopped moving. I was very proud because I didn’t use AI to help me. I tend to lean toward artsy solutions in all of my projects. The only setback with the solution I used for the bee in the maze is that you cannot change the color of the walls of the maze because then the RGB code will not link back up to the if statement correctly. It worked for my presentation, though.”</span></p><p dir="ltr"><span>While Baker could have created a traditional hitbox math test to determine whether rectangles intersect, Johnson was impressed with her solution: “She needed to figure it out, and she had a creative solution rather than the ‘right’ solution.”</span></p><p dir="ltr"><span>Of course, Johnson teaches Computational Foundations students the “right” solutions as well, but he fosters unconventional thinking because it can lead to innovation—more necessary than ever in a world driven by generative artificial intelligence.</span></p><p dir="ltr"><span>“Programming is in upheaval right now because of AI,” Johnson said. “Future programmers are going into a world where large language models and AI chatbots can do all sorts of creative-approximate stuff. Programmers need skills that AI can’t approximate. One of the main functions of a university is to teach people to think critically, because now we have machines that can do thinking-like things. 
So future programmers can either evaluate the machines and push back against them, or just roll over and let the machines win.”&nbsp;</span></p> <div class="align-center image_style-medium_750px_50_display_size_"> <div class="imageMediaStyle medium_750px_50_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/medium_750px_50_display_size_/public/2025-12/Coding%20with%20Creativity%202.JPG?itok=PtYdklk2" width="750" height="500" alt="student presents coding work"> </div> </div> <p dir="ltr"><br><span><strong>A ‘joyful experience’</strong></span></p><p dir="ltr"><span>Johnson, who also teaches introductory programming classes for computer science majors, believes creativity is necessary for coding. Far too many people, he said, see programming as “an arcane mathematical thing.”</span></p><div class="ucb-box ucb-box-title-left ucb-box-alignment-right ucb-box-style-fill ucb-box-theme-lightgray"><div class="ucb-box-inner"><div class="ucb-box-title">Creative Technology &amp; Design</div><div class="ucb-box-content"><p dir="ltr"><a href="https://colorado.edu/atlas" data-entity-type="external" rel="nofollow"><span>Learn more about the ATLAS Institute and CTD programs</span></a><span> including undergraduate major, minor and certificate; professional master’s; and PhD.</span></p><ul><li dir="ltr"><span>Online info sessions about all CTD programs are held regularly throughout the year.</span></li></ul><p><span>Prospective students can email&nbsp;</span><a href="mailto:atlascommunications@colorado.edu" rel="nofollow"><span>atlascommunications@colorado.edu</span></a><span> to schedule a student-led tour.</span></p></div></div></div><p dir="ltr"><span>That creative mindset shows up in his students’ work. 
Computational Foundations I students blended math and design in midterm projects, with one student creating billowing clouds moving across the sky, and another coding a Price Is Right-style Plinko game simulator—both of which present a visualization of a Gaussian distribution.&nbsp;</span></p><p dir="ltr"><span>“I teach Computational Foundations I basically in the same way I teach Computer Science 1300, except in Computational Foundations I, I have much more leeway in making it fun and design-oriented,” Johnson said. “I provide the most creative and joyful experience that you can have when learning to code, and let students figure out for themselves whether they want to learn more. And because it's so fun, many of them are enthusiastic about doing it.”</span></p><p dir="ltr"><span>Baker—who had “very minimal coding experience before starting Computational Foundations”—said her view of coding has changed dramatically since taking the class.&nbsp;</span></p><p dir="ltr"><span>“This class has shown me how creative coding is, that you can design with code and get an awesome, artistic output,” Baker said. 
“Coding has given me a new medium to make art with, and I’m very excited about that.”</span></p></div> </div> </div> </div> </div> <div>Computational Foundations I teaches code as a technical and expressive skill.</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Mon, 01 Dec 2025 18:07:35 +0000 Michael Kwolek 5155 at /atlas DNA origami: unfolding genetic breakthroughs /atlas/dna-origami-unfolding-genetic-breakthroughs <span>DNA origami: unfolding genetic breakthroughs</span> <span><span>Michael Kwolek</span></span> <span><time datetime="2025-11-18T10:39:32-07:00" title="Tuesday, November 18, 2025 - 10:39">Tue, 11/18/2025 - 10:39</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/2025-11/Alistar%20Living%20Matter%20Lab.JPG?h=82f92a78&amp;itok=6zqdeUqP" width="1200" height="800" alt="Mirela Alistar in lab coat with equipment"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/855"> Feature News </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/729" hreflang="en">alistar</a> <a href="/atlas/taxonomy/term/731" hreflang="en">living matter</a> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> </div> <a href="/atlas/michael-kwolek">Michael Kwolek</a> <div class="ucb-article-content 
ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> <div><div class="ucb-box ucb-box-title-left ucb-box-alignment-right ucb-box-style-fill ucb-box-theme-lightgray"><div class="ucb-box-inner"><div class="ucb-box-title"><span>Lab Venture Challenge</span></div><div class="ucb-box-content"><p><span>Johnson and Alistar competed as finalists in CU Boulder’s 2025&nbsp;</span><a href="/venturepartners/opportunities-and-events/lab-venture-challenge#finalists" rel="nofollow"><span>Lab Venture Challenge</span></a><span>, where their technology generated much interest from industry leaders.</span></p></div></div></div><p dir="ltr"><span>Access to DNA is crucial in many branches of biomedical research. But making long strands of DNA is time-consuming, error-prone and expensive.</span></p><p dir="ltr"><span>Over the years, researchers have worked to make DNA synthesis more efficient, with&nbsp;</span><a href="/asmagazine/2023/06/28/cu-boulders-marvin-caruthers-wins-inaugural-merkin-prize-biomedical-technology-developing" rel="nofollow"><span>important contributions made by Marvin Caruthers</span></a><span>, distinguished professor of chemistry and biochemistry at the University of Colorado Boulder. This research has advanced a range of biomedical fields, including drug and vaccine development, pathogen tests, and cancer diagnostics.&nbsp;</span></p><p dir="ltr"><span>Making DNA involves complex biochemical and mechanical processes to assemble a strand base by base. At each step there is a small chance of failure, and because the process repeats for every added base, those small chances compound as strands grow longer.&nbsp;</span></p><p dir="ltr"><span>Creating a DNA strand longer than 1,000 bases often takes several weeks, which can hinder research cycles.
To solve this, biotech companies have pursued incremental efficiency gains in strand construction.</span></p><p dir="ltr"><span>Now researchers in the ATLAS Institute’s&nbsp;</span><a href="/atlas/living-matter-lab" rel="nofollow"><span>Living Matter Lab</span></a><span> aim to rethink DNA synthesis altogether.</span></p><p dir="ltr"><span><strong>A new way to build DNA</strong></span></p><p dir="ltr"><span>Lab director and assistant professor&nbsp;</span><a href="/atlas/mirela-alistar" rel="nofollow"><span>Mirela Alistar</span></a><span> and postdoctoral researcher&nbsp;</span><a href="/atlas/joshua-johnson" rel="nofollow"><span>Joshua Johnson</span></a><span> are working to develop nanorobots that will more quickly and accurately build DNA to meet researchers’ specifications in a matter of days instead of weeks.&nbsp;</span></p><p dir="ltr"><span>They are employing DNA origami—a creative technique for shaping these building blocks of life—to create a nanorobot that speeds the process of making new DNA strands. “It folds much like paper origami, but it is made of DNA,” Johnson noted. “Our particular nanorobot is rectangular with a rotating arm element. It is about 2,000 times smaller than the width of a human hair.”</span></p><p dir="ltr"><span>DNA origami research dates back to 2006, with scientists making simple but precise nanoscale shapes and patterns. Alistar and Johnson aim to apply this technique to the mechanical arrangement of molecules.
“We are taking existing scientific concepts and combining them in new ways—much like engineering a normal-sized robot but at the molecular scale,” Johnson elaborated.</span></p><p dir="ltr"><span>Alistar explained the team’s contribution to DNA origami research as “designing the DNA structure that becomes a robot such that it is more stable, translating the fabrication process from extremely highly advanced labs to a little bit of a lower-key lab in computer science, which means we have to be inventive with a lot of the processes.”</span></p><div class="row ucb-column-container"><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2025-11/Johnson%20Living%20Matter%20Lab%203.JPG?itok=xzXitK2r" width="1500" height="1001" alt="Joshua Johnson in a lab coat holding a small container of DNA"> </div> </div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2025-11/Alistar%20Living%20Matter%20Lab%205.JPG?itok=r6605LnM" width="1500" height="1001" alt="Mirela Alistar working with a machine emitting UV light"> </div> </div></div><p dir="ltr"><br><span><strong>The right place for the research</strong></span></p><p dir="ltr"><span>The ATLAS Institute’s relationships with the College of Engineering and Applied Science create space for such breakthrough research. “We are interdisciplinary—I'm confident saying that,” Alistar said. “We do work with DNA for bacteriophages. We also work with microfluidics, which is also needed for the DNA nanorobot. So there are a lot of intersections in which we saw the potential for developing a DNA origami-based project in the lab.”</span></p><p dir="ltr"><span>Sensing great promise in their research, the team is seeking a commercialization path to reach the real world.
“This nanomachine process that we developed could be substantially faster than anything else in the industry,” Johnson noted. “There is a clear market need: biotech and pharmaceutical companies wait weeks for their large DNA strands, and that slows down research.”&nbsp;</span></p><p dir="ltr"><span>According to early market analysis, these companies would be willing to pay more to get their DNA faster. “We've identified that the gene synthesis market would benefit most because they need the longest DNA, they need it the fastest and they're willing to pay the most for it,” Johnson said.&nbsp;</span></p><div class="row ucb-column-container"><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2025-11/Nanosynth%20equipment%201.JPG?itok=PUdiUXS2" width="1500" height="1001" alt="Living Matter Lab rack holding scientific equipment"> </div> </div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2025-11/Nanosynth%20equipment%202.JPG?itok=PVnqLlXK" width="1500" height="1001" alt="Living Matter Lab scientist holding small sample case"> </div> </div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2025-11/Living%20Matter%20Lab%20Equipment.JPG?itok=-rCuzloX" width="1500" height="1001" alt="3D printed lab equipment"> </div> </div></div><p dir="ltr"><br><span>Alistar also noted potential in cell-free research. “A lot of development in biology goes toward using merely DNA, not living cells. 
Applications are mostly in vaccines.” If, for example, you could more quickly make a vaccine even in remote places, that could have major implications for global health.</span></p><p dir="ltr"><span>To commercialize this research, Alistar and Johnson are pursuing “a lot of support from CU Boulder and the state of Colorado in getting to an actual product,” Alistar explained. “If everything goes right, we're gonna be enrolled in a national-level program for two months of customer discovery research.”&nbsp;</span></p><p dir="ltr"><span>The team hopes to demonstrate the market feasibility of their new synthesis method within three years to improve one of the main bottlenecks in biotech research and help smooth the way toward improved vaccines, gene therapy and more personalized medicine.</span></p><div class="row ucb-column-container"><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2025-11/Alistar%20Living%20Matter%20Lab.JPG?itok=QM1esBAK" width="1500" height="1001" alt="Mirela Alistar in lab coat with equipment"> </div> </div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2025-11/Johnson%20Living%20Matter%20Lab.JPG?itok=ZBt9Wuon" width="1500" height="1001" alt="Joshua Johnson in a lab coat working with lab equipment"> </div> <p>&nbsp;</p></div></div></div> </div> </div> </div> </div> <div>Living Matter Lab designs nanorobots for DNA production to speed biomedical research.</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Tue, 18 Nov 2025 17:39:32 +0000 Michael Kwolek 5153 at /atlas Minds in rhythm /atlas/minds-rhythm <span>Minds in rhythm</span> <span><span>Michael Kwolek</span></span> <span><time
datetime="2025-11-11T09:33:16-07:00" title="Tuesday, November 11, 2025 - 09:33">Tue, 11/11/2025 - 09:33</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/atlas/sites/default/files/styles/focal_image_wide/public/2025-11/Brain%20Music%20String%20Quartet%202.JPG?h=82f92a78&amp;itok=-iLoo4fD" width="1200" height="800" alt="Violinists with EEG caps"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/atlas/taxonomy/term/703"> Feature </a> <a href="/atlas/taxonomy/term/855"> Feature News </a> <a href="/atlas/taxonomy/term/144"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/atlas/taxonomy/term/364" hreflang="en">CTD</a> <a href="/atlas/taxonomy/term/1464" hreflang="en">brainmusic</a> <a href="/atlas/taxonomy/term/771" hreflang="en">phd</a> <a href="/atlas/taxonomy/term/1426" hreflang="en">phd student</a> <a href="/atlas/taxonomy/term/773" hreflang="en">research</a> </div> <a href="/atlas/michael-kwolek">Michael Kwolek</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> <div> <div class="align-right image_style-small_500px_25_display_size_"> <div class="imageMediaStyle small_500px_25_display_size_"> <img loading="lazy" src="/atlas/sites/default/files/styles/small_500px_25_display_size_/public/2025-11/Thiago%20Roque.png?itok=ILrvxSlD" width="375" height="592" alt="Thiago Roque"> </div> </div> <p dir="ltr"><span>Imagine the cacophony of a conversation in which 
everyone talks, listens and responds at the same time.&nbsp;</span></p><p dir="ltr"><span>Trained musicians performing together can make a similar set of sensory inputs and brain activity truly resonate. Though making music together has been part of the human experience for thousands of years, the interbrain synchronization that occurs during performance is not well understood.</span></p><p dir="ltr"><span>As a member of the&nbsp;</span><a href="/atlas/brain-music-lab" rel="nofollow"><span>Brain Music Lab</span></a><span>, ATLAS PhD student&nbsp;</span><a href="/atlas/thiago-roque" rel="nofollow"><span>Thiago Roque</span></a><span> has developed novel techniques for studying these nuanced dynamics with the aim of expanding our understanding not only of musical performance, but also of human-to-human collaboration and connection more broadly.</span></p><p dir="ltr"><span>In his teens, Roque fell in love with music while beginning to develop his engineering skills. “I always wanted to be an engineer because I wanted to understand how things work, mostly toys and mechanics, electrical stuff,” he said, “but at that point, I also wanted to understand music.”</span></p><p dir="ltr"><span>When he got his first electronic keyboard, he realized, “An electrical engineer designed this to make music, so I realized that I could connect both things.”&nbsp;</span></p><p dir="ltr"><span>After earning BS and MS degrees in electrical engineering at the University of Campinas in Brazil, Roque came to study with&nbsp;</span><a href="/atlas/grace-leslie" rel="nofollow"><span>Grace Leslie</span></a><span> at Georgia Tech, then transferred to CU Boulder when Leslie opened her Brain Music Lab in the ATLAS Institute.</span></p><p dir="ltr"><span>“Thiago has been a really integral part of the Brain Music Lab,” Leslie noted.
“A lot of that has to do with his engineering background—it's rare to find graduate students who have the musical sophistication to be working on these projects and can rise to the occasion when it comes to developing custom technology for the research questions that we have.”</span></p><p dir="ltr"><span><strong>Studying brains in motion</strong></span></p><p dir="ltr"><span>Analyzing brain activity in moving bodies is surprisingly challenging—standard EEG data is captured in subjects who remain still.&nbsp;</span></p><p dir="ltr"><span>Roque has studied how dancers’ brains sync when they perform together, using his electrical engineering background to develop ways to improve the quality of EEG data in moving subjects.&nbsp;</span></p><p dir="ltr"><span>To compensate for all the action involved, he sewed motion sensors into the EEG caps and modified hardware to read neck and eye movement to improve data quality. This led to more ambitious plans with an even higher degree of difficulty.</span></p><p dir="ltr"><span><strong>The string ensemble experiment</strong></span></p><p dir="ltr"><span>Having dreamed for years of being able to analyze a string quartet performing a piece of music, Roque explained, “we needed all the equipment to be precisely synchronized, so we had to design this hardware that sends triggers and synchronizes everything. 
I designed and assembled the printed circuit boards myself.”</span></p><p dir="ltr"><span>He spent months integrating off-the-shelf EEG equipment, accelerometers and other sensors with custom-designed components to normalize the data and sync it between all the musicians.</span></p><div class="row ucb-column-container"><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2025-11/Brain%20Music%20String%20Quartet%201.JPG?itok=18Cp8IB4" width="1500" height="1001" alt="string quartet with EEG monitors and researchers around them"> </div> </div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2025-11/Brain%20Music%20String%20Quartet%202.JPG?itok=tipHYydB" width="1500" height="1001" alt="Violinists with EEG caps"> </div> </div><div class="col ucb-column"> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2025-11/Brain%20Music%20String%20Quartet%207.JPG?itok=axCuYWbF" width="1500" height="1000" alt="string quartet with EEG caps listens to music with their eyes closed"> </div> </div></div><p dir="ltr"><br><span>The next step was finding a quartet willing to participate in the experiment. Luckily, CU Boulder’s&nbsp;</span><a href="/music/" rel="nofollow"><span>College of Music</span></a><span>—across the street from the ATLAS Institute—is home to several student quartets, including the ensemble that ultimately agreed to participate.&nbsp;</span></p><p dir="ltr"><span>Roque said, “We wanted to work with students here because we know they will have regular rehearsals. They will have just met each other at the beginning of the semester, so they are new to it.
We are planning to measure them at the end of the semester so we can see the progress, how they develop.” &nbsp;</span></p><div class="ucb-box ucb-box-title-left ucb-box-alignment-right ucb-box-style-fill ucb-box-theme-lightgray"><div class="ucb-box-inner"><div class="ucb-box-title">The research team</div><div class="ucb-box-content"><p><span>This project has not been a solo gig.&nbsp;</span><a href="/atlas/daniel-ethridge" rel="nofollow"><span>Daniel Ethridge</span></a><span>,&nbsp;</span><a href="/atlas/daniel-llamas-maldonado" rel="nofollow"><span>Daniel Llamas Maldonado</span></a><span> and&nbsp;</span><a href="/atlas/sophia-mehdizadeh" rel="nofollow"><span>Sophia Mehdizadeh</span></a><span> from the Brain Music Lab—as well as several master’s and undergraduate students—have been instrumental in executing the string quartet research.</span></p></div></div></div><p dir="ltr"><span><strong>An interdisciplinary performance&nbsp;</strong></span></p><p dir="ltr"><span>For Roque, the ATLAS Institute offers several unique elements that make this type of research possible. “It's an interdisciplinary environment that fosters challenging research with high risks but potentially high payouts, and it's a very creative place,” he noted.&nbsp;</span></p><p dir="ltr"><span>“Thinking about the University of Colorado, I had this opportunity to enroll in this&nbsp;</span><a href="/ics/graduate-programs/cognitive-neuroscience-triple-phd" rel="nofollow"><span>triple PhD program</span></a><span>. 
I'm getting a PhD in creative technology and design, neuroscience and cognitive science.”&nbsp;</span></p><p dir="ltr"><span>Leslie explained how this research fits into the Brain Music Lab’s larger mission: “While we are focusing on technology and developing new technology and studying how humans interface with it, what sets us apart is our focus on the really human element to it.”</span></p><p dir="ltr"><span><strong>The next movement</strong></span></p><p dir="ltr"><span>Roque aims to continue studying this young quartet to determine whether their brain activity syncs more thoroughly as they continue to perform together. He would also like to study graduate musicians and seasoned professionals to learn how interbrain coupling may change based on the experience level of the musicians.</span></p><div class="ucb-box ucb-box-title-left ucb-box-alignment-right ucb-box-style-fill ucb-box-theme-lightgray"><div class="ucb-box-inner"><div class="ucb-box-title"><span>Expanding the scope</span></div><div class="ucb-box-content"><p><span>Brain Music Lab director Grace Leslie recently performed a solo improvisational piece,&nbsp;</span><a href="/atlas/inside-tank" rel="nofollow"><span>Inside the Tank</span></a><span>, in the B2 Black Box Theater, integrating an EEG headset and body sensors.</span></p><p><span>The lab team also outfitted several audience members with EEG monitors, giving Roque additional data to study the physiological responses of those experiencing live music.</span></p></div></div></div><p dir="ltr"><span>Roque also looks forward to bringing this technology to the stage.
Plans are in the works for a string quartet performance in the spring semester with a huge visualization of live physiological data to give the audience a sense of the musicians’ synchronization.</span></p><p dir="ltr"><span>“A lot of it is developing this technology that we hopefully can use in the future to continue to study musical group dynamics,” Leslie said, “but there's also this human-computer interaction application where he's done some of the foundational research to show that we can develop brain-computer interfaces that can be social.”</span></p><p dir="ltr"><span>This research may reveal insights as to how human connection and collaboration work. Over time, it could lead to tools and techniques to improve our ability to sync with each other when working on complex tasks—whether that means performing in a string quartet, playing a team sport or simply holding a nuanced conversation.</span></p> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/atlas/sites/default/files/styles/large_image_style/public/2025-11/Brain%20Music%20String%20Quartet%204.jpg?itok=5bpEe5cB" width="1500" height="844" alt="string quartet with EEG monitors and researchers around them"> </div> <p>&nbsp;</p></div> </div> </div> </div> </div> <div>ATLAS PhD student studies how brain activity syncs when musicians perform together.</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Tue, 11 Nov 2025 16:33:16 +0000 Michael Kwolek 5152 at /atlas